Consider the criterion function for ordinary least squares

$$
S(b)=(Y-Xb)'(Y-Xb)
$$

where $Y$ is the vector of dependent variables and $X$ the matrix of explanatory variables. It is well known that

$$
\frac{\partial S}{\partial b} = -2X'Y + 2X'Xb,
$$

and setting this derivative to zero and solving for $b$ yields the OLS estimator $\hat{b}=(X'X)^{-1}X'Y$.
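As a quick numerical sanity check, here is a minimal NumPy sketch with simulated data (the coefficients and sample size are arbitrary illustrations): the gradient $-2X'Y+2X'Xb$ does vanish at the closed-form estimator $\hat{b}=(X'X)^{-1}X'Y$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = rng.normal(size=(n, k))                      # n observations, k regressors
Y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

# Closed-form OLS estimator: b_hat = (X'X)^{-1} X'Y
b_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Gradient of S(b) = (Y - Xb)'(Y - Xb) is -2X'Y + 2X'Xb;
# it should be (numerically) zero at b_hat
grad = -2 * X.T @ Y + 2 * X.T @ X @ b_hat
print(np.allclose(grad, 0))
```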

Now, if we think of $\sigma^2$, the variance of the residuals, as a parameter to be estimated, does it then make sense to compute

$$
\frac{\partial S}{\partial \sigma^2}?
$$

If so, what is it?


#### Best Answer

Since, as usual, $Y\sim \mathcal{N}(Xb,\sigma^2 I_n)$, you can write $$S(b)=S(b,\sigma)=(Y-Xb)'(Y-Xb)=\epsilon'\epsilon$$ with the vector of residuals $\epsilon \sim \mathcal{N}(0,\sigma^2 I_n)$ and their standardized version $\sigma^{-1}\epsilon=\epsilon_0 \sim \mathcal{N}(0,I_n)$.

So, writing $S=\sigma^2\,{\epsilon_0}'\epsilon_0$ and holding $\epsilon_0$ fixed, $$\frac{\partial S}{\partial \sigma^2}=\frac{(Y-Xb)'(Y-Xb)}{\sigma^2}=\frac{\epsilon'\epsilon}{\sigma^2}={\epsilon_0}'\epsilon_0.$$

Note, by the way, that $$E\left(\frac{\epsilon'\epsilon}{\sigma^2}\right)=E({\epsilon_0}'\epsilon_0)=n,$$ where $n$ is the length of $Y$ (or $\epsilon$). This result holds even if the residuals are not Gaussian, provided they are centered and their variance is parametrized by $\sigma^2$.
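Both claims are easy to check numerically. A minimal sketch (with arbitrary choices of $n$ and $\sigma^2$): a Monte Carlo estimate of $E(\epsilon'\epsilon/\sigma^2)$ should be close to $n$, and a finite-difference derivative of $S(\sigma^2)=\sigma^2\,{\epsilon_0}'\epsilon_0$, holding $\epsilon_0$ fixed, should equal ${\epsilon_0}'\epsilon_0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2 = 50, 2.0

# Monte Carlo check that E[eps'eps / sigma^2] = n for Gaussian residuals
draws = rng.normal(scale=np.sqrt(sigma2), size=(100_000, n))
mean_val = (draws ** 2).sum(axis=1).mean() / sigma2
print(mean_val)  # should be close to n = 50

# Finite-difference check of dS/d(sigma^2): hold the standardized
# residuals eps0 fixed, so that S(sigma^2) = sigma^2 * eps0'eps0
eps0 = rng.normal(size=n)
S = lambda s2: s2 * (eps0 @ eps0)
h = 1e-6
fd = (S(sigma2 + h) - S(sigma2 - h)) / (2 * h)
print(np.isclose(fd, eps0 @ eps0))
```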
