### An optimization problem: variation

#### by Willie Wong

Examining the theorem proven in the previous post, we are led naturally to ask whether there are higher order generalizations.

**Question:** Let $f \in C^{n+1}([0,1])$ with $f^{(n+1)} > 0$. What can we say about the minimizer of $\int_0^1 |f(x) - p(x)|\,dx$, where $p$ ranges over degree $n$ polynomials?

It is pretty easy to see that we expect $p$ to intersect $f$ at the maximum number of points, which is $n+1$. We label those points $x_0 < x_1 < \dots < x_n$, and call $x_{-1} = 0$ and $x_{n+1} = 1$. Then the cost function can be written as

$$ I(x_0, \dots, x_n) = \sum_{i=-1}^{n} (-1)^{n-i} \int_{x_i}^{x_{i+1}} \bigl(f(x) - p(x)\bigr)\,dx. $$
Since we know the values of $p$ at the points $x_j$, we can write down the interpolation polynomial explicitly using Sylvester's formula:

$$ p(x) = \sum_{j=0}^{n} f(x_j) \prod_{k \neq j} \frac{x - x_k}{x_j - x_k}. $$
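This formula is simple enough to check directly. A minimal sketch in Python (the choice $f = e^x$ and the three nodes are mine, purely for illustration):

```python
import math

# Hypothetical data for illustration: f(x) = e^x with three distinct nodes
# (so n = 2); the formula works for any distinct nodes.
f = math.exp
nodes = [0.2, 0.5, 0.9]

def interp(x, nodes, f):
    """Evaluate p(x) = sum_j f(x_j) * prod_{k != j} (x - x_k)/(x_j - x_k)."""
    total = 0.0
    for j, xj in enumerate(nodes):
        basis = 1.0
        for k, xk in enumerate(nodes):
            if k != j:
                basis *= (x - xk) / (xj - xk)
        total += f(xj) * basis
    return total

# p agrees with f at every node, by construction.
for xj in nodes:
    assert abs(interp(xj, nodes, f) - f(xj)) < 1e-12
```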
The partial derivatives are now

$$ \frac{\partial I}{\partial x_j} = -\sum_{i=-1}^{n} (-1)^{n-i} \int_{x_i}^{x_{i+1}} \frac{\partial p}{\partial x_j}(x)\,dx, $$

the boundary terms dropping out since $f - p$ vanishes at each $x_j$.
It remains to compute $\frac{\partial p}{\partial x_j}$ for $j = 0, \dots, n$. We observe that when $k \neq j$

$$ \frac{\partial p}{\partial x_j}(x_k) = 0, $$

and also (differentiating the interpolation condition $p(x_j) = f(x_j)$ in $x_j$)

$$ \frac{\partial p}{\partial x_j}(x_j) = f'(x_j) - p'(x_j). $$

So, as $\frac{\partial p}{\partial x_j}$ is itself a polynomial of degree at most $n$,

$$ \frac{\partial p}{\partial x_j}(x) = \bigl(f'(x_j) - p'(x_j)\bigr) \prod_{k \neq j} \frac{x - x_k}{x_j - x_k}. $$
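The resulting formula for $\partial p/\partial x_j$ can be verified by finite differences; a sketch under the same illustrative assumptions ($f = e^x$ and arbitrary nodes, both my own choices):

```python
import math

# Finite-difference check (my own experiment) of
#   dp/dx_j (x) = (f'(x_j) - p'(x_j)) * prod_{k != j} (x - x_k)/(x_j - x_k).
f, fprime = math.exp, math.exp   # f = e^x, so f' = e^x as well
nodes = [0.2, 0.5, 0.9]
j = 1        # vary the middle node
x = 0.7      # fixed evaluation point

def interp(x, nodes):
    """Lagrange-form interpolation polynomial through the points (x_k, f(x_k))."""
    return sum(f(xj) * math.prod((x - xk) / (xj - xk)
                                 for k, xk in enumerate(nodes) if k != i)
               for i, xj in enumerate(nodes))

h = 1e-6
# left-hand side: differentiate p(x) with respect to the node x_j
up = nodes.copy(); up[j] += h
dn = nodes.copy(); dn[j] -= h
lhs = (interp(x, up) - interp(x, dn)) / (2 * h)

# right-hand side: (f'(x_j) - p'(x_j)) * ell_j(x)
pprime = (interp(nodes[j] + h, nodes) - interp(nodes[j] - h, nodes)) / (2 * h)
ell_j = math.prod((x - xk) / (nodes[j] - xk)
                  for k, xk in enumerate(nodes) if k != j)
rhs = (fprime(nodes[j]) - pprime) * ell_j

assert abs(lhs - rhs) < 1e-4
```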
Now, we observe that on the interval $(x_i, x_{i+1})$ the sign of $f - p$ is exactly $(-1)^{n-i}$, i.e.

$$ \operatorname{sgn}\bigl(f(x) - p(x)\bigr) = \operatorname{sgn}\prod_{k=0}^{n}(x - x_k), $$

so after some computation we arrive at

$$ \frac{\partial I}{\partial x_j} = -\bigl(f'(x_j) - p'(x_j)\bigr) \sum_{i=-1}^{n} (-1)^{n-i} \int_{x_i}^{x_{i+1}} \prod_{k \neq j} \frac{x - x_k}{x_j - x_k}\,dx, $$

which we can further simplify to

$$ \frac{\partial I}{\partial x_j} = -\frac{f'(x_j) - p'(x_j)}{\prod_{k \neq j}(x_j - x_k)} \int_0^1 \operatorname{sgn}\bigl(f(x) - p(x)\bigr) \prod_{k \neq j}(x - x_k)\,dx. $$
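This last expression can also be sanity-checked numerically, by differentiating the cost through the nodes with finite differences and comparing against the closed form. Again a sketch under my own illustrative assumptions ($f = e^x$, $n = 2$, arbitrary nodes):

```python
import math

# Finite-difference check (my own experiment) of the gradient formula
#   dI/dx_j = -(f'(x_j) - p'(x_j)) * sum_i (-1)^{n-i} int_{x_i}^{x_{i+1}} ell_j(x) dx
# with f(x) = e^x on [0, 1], n = 2, and arbitrarily chosen nodes.
f, fprime = math.exp, math.exp

def simpson(g, a, b, m=200):
    """Composite Simpson rule on [a, b] with 2*m subintervals."""
    h = (b - a) / (2 * m)
    return (g(a) + g(b)
            + 4 * sum(g(a + (2 * i + 1) * h) for i in range(m))
            + 2 * sum(g(a + 2 * i * h) for i in range(1, m))) * h / 3

def interp(x, nodes):
    """Lagrange-form interpolation polynomial through the points (x_k, f(x_k))."""
    return sum(f(xj) * math.prod((x - xk) / (xj - xk)
                                 for k, xk in enumerate(nodes) if k != i)
               for i, xj in enumerate(nodes))

def cost(nodes):
    """I = sum_i (-1)^{n-i} int_{x_i}^{x_{i+1}} (f - p) dx, with x_{-1} = 0, x_{n+1} = 1."""
    n = len(nodes) - 1
    pts = [0.0] + list(nodes) + [1.0]
    return sum((-1) ** (n - i) * simpson(lambda x: f(x) - interp(x, nodes), a, b)
               for i, (a, b) in enumerate(zip(pts, pts[1:]), start=-1))

nodes, j = [0.2, 0.5, 0.9], 1
n = len(nodes) - 1

h = 1e-5
up = nodes.copy(); up[j] += h
dn = nodes.copy(); dn[j] -= h
fd = (cost(up) - cost(dn)) / (2 * h)          # numerical dI/dx_j

ell = lambda x: math.prod((x - xk) / (nodes[j] - xk)
                          for k, xk in enumerate(nodes) if k != j)
pprime = (interp(nodes[j] + h, nodes) - interp(nodes[j] - h, nodes)) / (2 * h)
pts = [0.0] + nodes + [1.0]
grad = -(fprime(nodes[j]) - pprime) * sum(
    (-1) ** (n - i) * simpson(ell, a, b)
    for i, (a, b) in enumerate(zip(pts, pts[1:]), start=-1))

assert abs(fd - grad) < 1e-4
```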
Now, since $f$ and $p$ cross transversely at $x_j$, the difference of their derivatives $f'(x_j) - p'(x_j)$ is non-zero. (This harks back to our assumption that $f^{(n+1)} > 0$.) So we are down, as in the case where $n = 1$, to equations entirely independent of $f$.

More precisely, we see that the stationarity condition becomes the choice of $x_0, \dots, x_n$ such that the integrals

$$ \int_0^1 \operatorname{sgn}\bigl(f(x) - p(x)\bigr) \prod_{k \neq j}(x - x_k)\,dx $$

vanish for each $j$. Since $\left\{\prod_{k \neq j}(x - x_k)\right\}_{j=0}^{n}$ form a basis for the polynomials of degree at most $n$, we have that the function

$$ \operatorname{sgn}\bigl(f(x) - p(x)\bigr) = \operatorname{sgn}\prod_{k=0}^{n}(x - x_k) $$

is orthogonal to every polynomial of degree at most $n$. So in particular the $x_j$ are solutions to the following system of equations:

$$ \sum_{i=-1}^{n} (-1)^{n-i}\, \frac{x_{i+1}^{m+1} - x_i^{m+1}}{m+1} = 0, \qquad m = 0, 1, \dots, n. $$
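These moment conditions are easy to solve numerically for small $n$. A sketch (my own setup throughout: the moment sums, the symmetric ansatz, and the bisection solver), which for $n = 1$ recovers crossing points at $1/4$ and $3/4$:

```python
import math

# Solve (numerically; my own experiment) the system
#   sum_{i=-1}^{n} (-1)^{n-i} (x_{i+1}^{m+1} - x_i^{m+1}) / (m+1) = 0,  m = 0..n,
# with x_{-1} = 0 and x_{n+1} = 1, using the symmetric ansatz x_j = 1 - x_{n-j}.
def moment(pts, n, m):
    """pts = [x_{-1}, x_0, ..., x_n, x_{n+1}]; returns the m-th moment sum."""
    return sum((-1) ** (n - i) * (pts[i + 2] ** (m + 1) - pts[i + 1] ** (m + 1))
               for i in range(-1, n + 1)) / (m + 1)

def bisect(g, a, b, iters=80):
    """Simple bisection; assumes g changes sign on [a, b]."""
    for _ in range(iters):
        c = (a + b) / 2
        if (g(a) < 0) == (g(c) < 0):
            a = c
        else:
            b = c
    return (a + b) / 2

# n = 1: ansatz x_0 = t, x_1 = 1 - t; symmetry leaves only the m = 0 equation.
t1 = bisect(lambda t: moment([0.0, t, 1 - t, 1.0], 1, 0), 0.01, 0.49)
assert abs(t1 - 0.25) < 1e-10            # crossings at 1/4 and 3/4

# n = 2: ansatz x_0 = t, x_1 = 1/2, x_2 = 1 - t; symmetry leaves only m = 1.
t2 = bisect(lambda t: moment([0.0, t, 0.5, 1 - t, 1.0], 2, 1), 0.01, 0.49)
assert abs(t2 - (1 - math.sqrt(0.5)) / 2) < 1e-10
```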
From symmetry considerations we have that $x_j = 1 - x_{n-j}$. This also kills about half of the equations. For low $n$ we have