Solved – First-order condition of the sum of squares with respect to the variance of the residuals

Consider the criterion function for ordinary least squares
$$
S(b)=(Y-Xb)'(Y-Xb)
$$
where $Y$ is the vector of observations of the dependent variable and $X$ is the matrix of explanatory variables. It is of course known that
$$
\frac{\partial S}{\partial b} = -2\cdot X'Y+2\cdot X'X\cdot b,
$$
and solving for $b$ yields the OLS estimator.
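To make this concrete, here is a minimal numerical sketch of that first-order condition (NumPy and the simulated data are illustrative assumptions, not part of the original question): at the solution of the normal equations, the gradient $-2X'Y+2X'X\hat b$ vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = rng.normal(size=(n, k))                               # matrix of explanatory variables
Y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)   # simulated response

# OLS estimator from the normal equations X'X b = X'Y
b_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# First-order condition: -2 X'Y + 2 X'X b_hat should be numerically zero
grad = -2 * X.T @ Y + 2 * X.T @ X @ b_hat
print(np.allclose(grad, 0.0))                             # True
```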

Now, if we think of $\sigma^2$, the variance of the residuals, as a parameter to be estimated, does it then make sense to compute
$$
\frac{\partial S}{\partial \sigma^2}\,?
$$
And if so, what is it?

Since usually $Y\sim \mathcal{N}(Xb,\sigma^2 I_n)$, you can write $$S(b)=S(b,\sigma)=(Y-Xb)'(Y-Xb)=\epsilon'\epsilon$$ with the vector of residuals $\epsilon \sim \mathcal{N}(0,\sigma^2 I_n)$ and their standardized version $\sigma^{-1}\epsilon=\epsilon_0 \sim \mathcal{N}(0,I_n)$.

Holding the standardized residuals $\epsilon_0$ fixed, $S=\sigma^2\,{\epsilon_0}'\epsilon_0$ is linear in $\sigma^2$. So $$\frac{\partial S}{\partial \sigma^2}=\frac{(Y-Xb)'(Y-Xb)}{\sigma^2}=\frac{\epsilon'\epsilon}{\sigma^2}={\epsilon_0}'\epsilon_0.$$
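A quick finite-difference sanity check of this derivative (a sketch with simulated standardized residuals; the setup is my assumption, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(1)
eps0 = rng.normal(size=50)                   # standardized residuals, variance 1

def S(sigma2):
    # S as a function of sigma^2 with eps0 held fixed: sigma^2 * eps0'eps0
    return sigma2 * (eps0 @ eps0)

# Central finite difference of S at sigma^2 = 2.0
h = 1e-6
fd = (S(2.0 + h) - S(2.0 - h)) / (2 * h)
print(np.isclose(fd, eps0 @ eps0))           # True: dS/d(sigma^2) = eps0'eps0
```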

Note by the way that $$E\left(\frac{\epsilon'\epsilon}{\sigma^2}\right)=E({\epsilon_0}'\epsilon_0)=n,$$ where $n$ is the length of $Y$ (or of $\epsilon$). This result holds even when the residuals are not Gaussian, as long as they are centered and their variance is parametrized by $\sigma^2$.
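A small Monte Carlo sketch of this expectation, for both Gaussian residuals and one non-Gaussian case (the centered uniform example is an illustrative assumption, chosen only because its variance is easy to scale to $\sigma^2$):

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2, reps = 20, 4.0, 200_000

# Gaussian residuals with variance sigma^2
eps_g = rng.normal(scale=np.sqrt(sigma2), size=(reps, n))

# Non-Gaussian example: centered uniform scaled to variance sigma^2
a = np.sqrt(3 * sigma2)                      # Var(Uniform(-a, a)) = a^2 / 3
eps_u = rng.uniform(-a, a, size=(reps, n))

for eps in (eps_g, eps_u):
    mc = ((eps ** 2).sum(axis=1) / sigma2).mean()   # estimate of E(eps'eps / sigma^2)
    print(round(mc, 2))                      # both close to n = 20
```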
