As a means of motivating the question, consider a regression problem where we seek to estimate $Y$ using observed variables $\{a, b\}$.

When doing multivariate polynomial regression, I try to find the optimal parametrization of the function

$$f(a,b)=c_{1}a+c_{2}b+c_{3}a^{2}+c_{4}ab+c_{5}b^{2}+\cdots$$

that best fits the data in a least-squares sense.

The problem with this, however, is that the parameters $c_i$ are not independent. Is there a way to do the regression on a different set of "basis" vectors which are orthogonal? Doing this has several obvious advantages:

1) The coefficients are no longer correlated.

2) The values of the $c_i$'s themselves no longer depend on the degree of the expansion.

3) There is the computational advantage of being able to drop the higher-order terms for a coarser but still accurate approximation to the data.

This is easily achieved in the single-variable case using orthogonal polynomials, such as the well-studied Chebyshev polynomials. It's not obvious, however (to me anyway), how to generalize this! It occurred to me that I could multiply Chebyshev polynomials pairwise, but I'm not sure whether that is the mathematically correct thing to do.
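For what it's worth, the "pairwise multiplication" idea can be tried directly with NumPy's `numpy.polynomial.chebyshev` module, whose `chebvander2d` builds exactly the design matrix of products $T_i(a)\,T_j(b)$. Below is a minimal sketch; the sampled surface, sample count, and degrees are made-up illustration values, not anything from the question:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Toy data (hypothetical): noisy samples of a smooth surface on [-1, 1]^2.
rng = np.random.default_rng(0)
a = rng.uniform(-1, 1, 500)
b = rng.uniform(-1, 1, 500)
z = np.exp(-a**2) * np.cos(2 * b) + 0.01 * rng.standard_normal(500)

# Tensor-product design matrix: one column per product T_i(a) * T_j(b)
# with 0 <= i, j <= 3 -- the "pairwise multiplication" of Chebyshev
# polynomials in each variable.
V = C.chebvander2d(a, b, [3, 3])          # shape (500, 16)

# Ordinary least squares for the coefficients c_ij.
coef, *_ = np.linalg.lstsq(V, z, rcond=None)

# Evaluate the fitted surface at a new point.
pred = C.chebval2d(0.2, -0.5, coef.reshape(4, 4))
```

Note that the products are orthogonal with respect to the product Chebyshev weight, not with respect to an arbitrary sampling distribution, which is part of what the answer below gets at.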

Your help is appreciated.


#### Best Answer

For completion's sake (and to help improve the stats of this site, ha), I have to wonder if this paper wouldn't also answer your question:

ABSTRACT: We discuss the choice of polynomial basis for approximation of uncertainty propagation through complex simulation models with capability to output derivative information. Our work is part of a larger research effort in uncertainty quantification using sampling methods augmented with derivative information. The approach has new challenges compared with standard polynomial regression. In particular, we show that a tensor product multivariate orthogonal polynomial basis of an arbitrary degree may no longer be constructed. We provide sufficient conditions for an orthonormal set of this type to exist, and a basis for the space it spans. We demonstrate the benefits of the basis in the propagation of material uncertainties through a simplified model of heat transport in a nuclear reactor core. Compared with the tensor product Hermite polynomial basis, the orthogonal basis results in a better numerical conditioning of the regression procedure, a modest improvement in approximation error when basis polynomials are chosen a priori, and a significant improvement when basis polynomials are chosen adaptively, using a stepwise fitting procedure.

Otherwise, the tensor-product basis of one-dimensional polynomials is not only the appropriate technique, but also the *only* one I can find for this.
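The conditioning benefit the abstract mentions is easy to see numerically: compare the condition number of the raw monomial design matrix with that of the tensor-product Chebyshev one. A sketch, with made-up sample counts and degrees (`polyvander2d` and `chebvander2d` are real NumPy functions):

```python
import numpy as np
from numpy.polynomial import chebyshev as C, polynomial as P

# Hypothetical setup: 2000 uniform samples on [-1, 1]^2.
rng = np.random.default_rng(1)
a = rng.uniform(-1, 1, 2000)
b = rng.uniform(-1, 1, 2000)

deg = [8, 8]
V_raw = P.polyvander2d(a, b, deg)    # raw monomials a^i * b^j
V_cheb = C.chebvander2d(a, b, deg)   # products T_i(a) * T_j(b)

print(np.linalg.cond(V_raw))   # large: monomial columns are nearly collinear
print(np.linalg.cond(V_cheb))  # orders of magnitude smaller
```

The gap widens quickly with degree, which is why the raw-monomial fit becomes numerically fragile long before the Chebyshev tensor basis does.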
