# Solved – Multivariate orthogonal polynomial regression

As a means of motivating the question, consider a regression problem where we seek to estimate \$Y\$ from the observed variables \${a, b}\$.

When doing multivariate polynomial regression, I try to find the optimal parametrization of the function

\$\$f(a,b)=c_{1}a+c_{2}b+c_{3}a^{2}+c_{4}ab+c_{5}b^{2}+\cdots\$\$

which best fits the data in a least-squares sense.
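For concreteness, here is a minimal sketch of that fit with NumPy, using made-up data and a monomial basis up to total degree 2 (the data-generating coefficients are my own illustrative choices, not from the question):

```python
import numpy as np

# Hypothetical data: 200 samples of (a, b) and a noisy response y.
rng = np.random.default_rng(0)
a = rng.uniform(-1, 1, 200)
b = rng.uniform(-1, 1, 200)
y = 1.0 + 2.0 * a - 0.5 * b + 0.3 * a * b + rng.normal(0, 0.05, 200)

# Monomial design matrix: columns [1, a, b, a^2, ab, b^2].
X = np.column_stack([np.ones_like(a), a, b, a**2, a * b, b**2])

# Least-squares fit for the coefficients c_i.
c, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Note that the columns of `X` are correlated with each other on this sample, which is exactly the dependence between the \$c_i\$'s the question is about.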

The problem with this, however, is that the parameters \$c_i\$ are not independent. Is there a way to do the regression on a different set of "basis" vectors that are orthogonal? Doing so has several obvious advantages:

1) The coefficients are no longer correlated.
2) The values of the \$c_i\$'s themselves no longer depend on the degree of the expansion.
3) There is the computational advantage of being able to drop the higher-order terms for a coarser but still accurate approximation to the data.

This is easily achieved in the single-variable case using orthogonal polynomials, drawing on a well-studied family such as the Chebyshev polynomials. It's not obvious (to me, anyway) how to generalize this. It occurred to me that I could multiply Chebyshev polynomials pairwise, but I'm not sure whether that is the mathematically correct thing to do.
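For what it's worth, the pairwise-product idea corresponds to the tensor-product basis \$T_i(a)\,T_j(b)\$, and NumPy can build the corresponding design matrix directly via `chebvander2d`. A sketch with made-up data (the test function and degrees are my own choices); one caveat is that these products are orthogonal with respect to the product Chebyshev weight on \$[-1,1]^2\$, so the sample columns are only approximately orthogonal unless the design points are chosen accordingly:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical data on [-1, 1]^2 with a smooth target and small noise.
rng = np.random.default_rng(1)
a = rng.uniform(-1, 1, 300)
b = rng.uniform(-1, 1, 300)
y = np.cos(2 * a) * b + 0.1 * rng.normal(size=300)

# Pseudo-Vandermonde matrix whose columns are the products
# T_i(a) * T_j(b) for 0 <= i, j <= 3 (16 columns in total).
deg = [3, 3]
V = C.chebvander2d(a, b, deg)

# Least-squares fit of the tensor-product coefficients.
coef, *_ = np.linalg.lstsq(V, y, rcond=None)
```

Truncating the fit then amounts to simply dropping the columns (and coefficients) with large \$i + j\$, which is the coarsening advantage mentioned above.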
