I know from standard theory that the bias-variance decomposition for Mean Squared Error is (for an estimator $\hat{\mu}$ of $\mu$):
$$
E\left[\left(\hat{\mu}-\mu\right)^2\right] = Var(\hat{\mu}) + \left(E\left[\hat{\mu}\right]-\mu\right)^2 = Var(\hat{\mu}) + Bias(\hat{\mu}, \mu)^2
$$
However, if we are instead talking about vectors, where $\boldsymbol{\hat{\mu}}$ is an estimator of $\boldsymbol{\mu}$, both of which are $n \times 1$ vectors, I was wondering if there is a correspondingly nice decomposition, as in the scalar case above, for:
$$
E\left[\|\boldsymbol{\hat{\mu}}-\boldsymbol{\mu}\|^2\right]\,?
$$
Thanks!
Best Answer
Simply note that
$$\|\widehat{\mu} - \mu\|^2 = \sum\limits_{i = 1}^{n} (\widehat{\mu}_{i} - \mu_{i})^2$$
Then the answer follows from the decomposition you gave earlier, applied to each coordinate:
$$ \mathbb{E}[(\widehat{\mu}_{i} - \mu_{i})^2] = Var[\widehat{\mu}_{i}] + [Bias(\widehat{\mu}_{i}, \mu_{i})]^2 $$
Summing over $i$, we get
$$ \mathbb{E}[\|\widehat{\mu} - \mu\|^2] = \sum\limits_{i = 1}^{n} \left( Var[\widehat{\mu}_{i}] + [Bias(\widehat{\mu}_{i}, \mu_{i})]^2 \right) $$
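As a quick sanity check (my own illustration, not part of the original answer), here is a minimal NumPy sketch that verifies this identity by Monte Carlo for a deliberately biased estimator; the dimension, sample size, and shrinkage factor are made up for the example:

```python
import numpy as np

# Monte Carlo sanity check of
#   E[||mu_hat - mu||^2] = sum_i ( Var[mu_hat_i] + Bias(mu_hat_i, mu_i)^2 )
# using a deliberately biased estimator: a shrunken sample mean.
# (All numbers here -- dimension, shrinkage, sample size -- are illustrative.)

rng = np.random.default_rng(0)
n, m, reps = 3, 10, 200_000          # dimension, sample size, Monte Carlo reps
mu = np.array([1.0, -2.0, 0.5])      # true mean vector

# Each replication: draw m observations from N(mu, I), estimate mu by 0.9 * xbar
samples = rng.normal(loc=mu, scale=1.0, size=(reps, m, n))
mu_hat = 0.9 * samples.mean(axis=1)  # shape (reps, n); biased since E[mu_hat] = 0.9 * mu

lhs = np.mean(np.sum((mu_hat - mu) ** 2, axis=1))                   # E[||mu_hat - mu||^2]
rhs = np.sum(mu_hat.var(axis=0) + (mu_hat.mean(axis=0) - mu) ** 2)  # sum of Var + Bias^2

print(lhs, rhs)  # the two values agree up to Monte Carlo noise
```

The shrinkage factor 0.9 is chosen only so that the bias term is nonzero; with the plain sample mean the bias part would vanish and the check would be less informative.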
Another issue, totally different, is the matrix $\mathbb{E}[(\widehat{\mu} - \mu)(\widehat{\mu} - \mu)^t]$ (the MSE matrix, which reduces to the covariance matrix of $\widehat{\mu}$ when the estimator is unbiased).
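For completeness (a standard identity, not spelled out in the original answer), this matrix decomposes analogously, and taking its trace recovers the scalar result above:
$$ \mathbb{E}[(\widehat{\mu} - \mu)(\widehat{\mu} - \mu)^t] = Cov(\widehat{\mu}) + Bias(\widehat{\mu}, \mu)\,Bias(\widehat{\mu}, \mu)^t, \qquad tr\left(\mathbb{E}[(\widehat{\mu} - \mu)(\widehat{\mu} - \mu)^t]\right) = \mathbb{E}[\|\widehat{\mu} - \mu\|^2], $$
where $Bias(\widehat{\mu}, \mu) = \mathbb{E}[\widehat{\mu}] - \mu$ is the bias vector.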