I'm trying to prove the efficiency of OLS over GLS when the covariance matrix of the error $\varepsilon$ is mistakenly assumed to be $\sigma^2\Sigma$ instead of $\sigma^2 I$. After deriving the variances, what I have so far is $Var(\beta^{ols}) = \sigma^2(X'X)^{-1}$ and
$Var(\beta^{gls}) = \sigma^2(X'\Sigma^{-1} X)^{-1}$.
I want to show that $Var(\beta^{gls}) - Var(\beta^{ols})$ is psd.
$$\implies \sigma^2(X'\Sigma^{-1} X)^{-1} - \sigma^2(X'X)^{-1} \geq 0\\
\implies (X'\Sigma^{-1} X)^{-1} - (X'X)^{-1} \geq 0 \\
\iff X'X - X'\Sigma^{-1} X \geq 0
$$
But this is where I don't know how to proceed. First I tried $X'(I-\Sigma^{-1})X \geq 0$ but got stuck. Any hints on how to continue?
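A quick numerical sanity check (just a sketch in numpy, with an arbitrary $X$ and the perfectly valid choice $\Sigma = 0.5I$; these values are only illustrative) suggests the last inequality can actually fail, which is what made me revisit the variance derivation in the update below:

```python
import numpy as np

# Sanity check: is (X' Sigma^{-1} X)^{-1} - (X'X)^{-1} always psd?
# Try an arbitrary X and the perfectly valid Sigma = 0.5 * I.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Sigma = 0.5 * np.eye(50)

naive_gls_var = np.linalg.inv(X.T @ np.linalg.inv(Sigma) @ X)  # (X' Sigma^{-1} X)^{-1}
ols_var = np.linalg.inv(X.T @ X)                               # (X' X)^{-1}

# Smallest eigenvalue of the difference: negative for this Sigma,
# so the difference is not psd and the last inequality cannot hold in general.
print(np.linalg.eigvalsh(naive_gls_var - ols_var).min())
```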
Update:
After one of the comments, I want to double check the variance of $\beta^{gls}$:
$$Var(\beta^{gls}) = Var[(X'\Sigma^{-1}X)^{-1} X'\Sigma^{-1}\varepsilon]\\
=(X'\Sigma^{-1}X)^{-1} X'\Sigma^{-1} Var(\varepsilon) \Sigma^{-1} X(X'\Sigma^{-1}X)^{-1}
$$
Here is where I think I made the mistake: I plugged in the covariance we (mistakenly) assume to be right, i.e. $Var(\varepsilon) = \sigma^2\Sigma$. Should I instead use the true one, i.e. $\sigma^2 I$?
Best Answer
This is what I ended up doing, using the true $Var(\varepsilon) = \sigma^2 I$ in the sandwich:
$$Var(\beta^{gls}) = Var[(X'\Sigma^{-1}X)^{-1} X'\Sigma^{-1}\varepsilon]\\
=(X'\Sigma^{-1}X)^{-1} X'\Sigma^{-1} Var(\varepsilon) \Sigma^{-1} X(X'\Sigma^{-1}X)^{-1}\\
= \sigma^2 (X'\Sigma^{-1}X)^{-1} X'\Sigma^{-1} \Sigma^{-1} X(X'\Sigma^{-1}X)^{-1}
$$
I want to show $Var(\beta^{gls}) - Var(\beta^{ols})$ is psd.
$$\implies \sigma^2 (X'\Sigma^{-1}X)^{-1} X'\Sigma^{-1} \Sigma^{-1} X(X'\Sigma^{-1}X)^{-1} - \sigma^2(X'X)^{-1} \geq 0\\
\iff X'X - X'\Sigma^{-1}X( X'\Sigma^{-1} \Sigma^{-1} X)^{-1}X'\Sigma^{-1}X \geq 0\\
\implies X'(I-\Sigma^{-1}X( X'\Sigma^{-1} \Sigma^{-1} X)^{-1}X'\Sigma^{-1})X \geq 0\\
\implies X'MX \geq 0
$$
where $M = I - Z(Z'Z)^{-1}Z'$ with $Z = \Sigma^{-1}X$ is the residual maker matrix for $Z$; since $M$ is symmetric and idempotent, it is psd, and the statement holds.
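As a sanity check on the algebra, here is a small numerical sketch (an arbitrary $X$, an arbitrary positive-definite $\Sigma$, and $\sigma^2 = 1$; none of these values come from the problem itself) that builds the sandwich variance of $\beta^{gls}$ under the true $Var(\varepsilon) = \sigma^2 I$ and checks that the difference with $Var(\beta^{ols})$ has no negative eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, sigma2 = 50, 3, 1.0

X = rng.normal(size=(n, k))
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)       # an arbitrary positive-definite "assumed" Sigma

Sigma_inv = np.linalg.inv(Sigma)
# beta_gls = B y with B = (X' Sigma^{-1} X)^{-1} X' Sigma^{-1}
B = np.linalg.inv(X.T @ Sigma_inv @ X) @ X.T @ Sigma_inv

# True Var(eps) = sigma^2 I, so the sandwich variance collapses to sigma^2 * B B'
var_gls = sigma2 * B @ B.T
var_ols = sigma2 * np.linalg.inv(X.T @ X)

# Smallest eigenvalue should be >= 0 (up to floating-point error),
# i.e. Var(beta_gls) - Var(beta_ols) is psd, as the derivation above claims.
print(np.linalg.eigvalsh(var_gls - var_ols).min())
```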