Solved – bias-variance decomposition of MSE for vector estimators

I know from standard theory that the bias-variance decomposition of the Mean Squared Error of an estimator $\hat{\mu}$ of $\mu$ is:

$$E\left[\left(\hat{\mu}-\mu\right)^2\right] = Var(\hat{\mu}) + \left(E\left[\hat{\mu}\right]-\mu\right)^2 = Var(\hat{\mu}) + Bias(\hat{\mu}, \mu)^2$$
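(For completeness, the scalar identity follows by adding and subtracting $E[\hat{\mu}]$ inside the square and expanding:

$$
\begin{aligned}
E\left[(\hat{\mu}-\mu)^2\right]
&= E\left[\left(\hat{\mu}-E[\hat{\mu}]\right)^2\right] + 2\left(E[\hat{\mu}]-\mu\right)E\left[\hat{\mu}-E[\hat{\mu}]\right] + \left(E[\hat{\mu}]-\mu\right)^2 \\
&= Var(\hat{\mu}) + 0 + Bias(\hat{\mu},\mu)^2,
\end{aligned}
$$

since $E\left[\hat{\mu}-E[\hat{\mu}]\right] = 0$.)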

However, if instead $\boldsymbol{\hat{\mu}}$ is an estimator of $\boldsymbol{\mu}$, both of which are $n\times 1$ vectors, I was wondering whether there is a correspondingly nice decomposition for:

$$\mathbb{E}\left[\|\boldsymbol{\hat{\mu}}-\boldsymbol{\mu}\|^2\right]$$
Simply note that

$$\|\widehat{\mu} - \mu\|^2 = \sum\limits_{i=1}^{n} (\widehat{\mu}_{i} - \mu_{i})^2$$

Then the answer follows from the scalar decomposition you gave earlier, applied coordinate-wise:

$$\mathbb{E}[(\widehat{\mu}_{i} - \mu_{i})^2] = Var[\widehat{\mu}_{i}] + [Bias(\widehat{\mu}_{i}, \mu_{i})]^2$$

Summing over $i$, we get

$$\mathbb{E}\left[\|\widehat{\mu} - \mu\|^2\right] = \sum\limits_{i=1}^{n} \left( Var[\widehat{\mu}_{i}] + [Bias(\widehat{\mu}_{i}, \mu_{i})]^2 \right)$$
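A quick numerical sanity check of the summed decomposition (a sketch using a hypothetical shrunken-mean estimator, not anything from the question): simulate many replications of a biased estimator and compare both sides. With variances computed as population variances (`ddof=0`), the identity holds exactly for the empirical averages too, not just in expectation.

```python
import numpy as np

rng = np.random.default_rng(0)
reps, m, n = 100_000, 10, 3          # replications, sample size, dimension
mu = np.array([1.0, -2.0, 0.5])      # true mean vector

# Deliberately biased estimator: sample mean of m draws, shrunk toward zero.
samples = rng.normal(mu, 1.0, size=(reps, m, n))
mu_hat = 0.9 * samples.mean(axis=1)  # shape (reps, n)

# Left-hand side: empirical E[||mu_hat - mu||^2].
mse = np.mean(np.sum((mu_hat - mu) ** 2, axis=1))

# Right-hand side: sum over coordinates of variance plus squared bias.
var = mu_hat.var(axis=0).sum()                    # sum_i Var[mu_hat_i]
bias2 = ((mu_hat.mean(axis=0) - mu) ** 2).sum()   # sum_i Bias_i^2

print(mse, var + bias2)  # the two values agree
```

The agreement is exact (up to floating point) because the empirical mean of squared errors decomposes into empirical variance plus squared empirical bias coordinate by coordinate.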

Another issue, totally different, is the matrix $\mathbb{E}[(\widehat{\mu} - \mu)(\widehat{\mu} - \mu)^t]$ (which is the covariance matrix of $\widehat{\mu}$ only when the estimator is unbiased); its trace equals the scalar MSE above.
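The trace connection between the matrix $\mathbb{E}[(\widehat{\mu} - \mu)(\widehat{\mu} - \mu)^t]$ and the scalar MSE can also be checked numerically; this sketch reuses the same hypothetical shrunken-mean estimator as an assumption, since the question specifies no particular estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
reps, m, n = 100_000, 10, 3
mu = np.array([1.0, -2.0, 0.5])

# Hypothetical biased estimator for illustration: shrunken sample mean.
mu_hat = 0.9 * rng.normal(mu, 1.0, size=(reps, m, n)).mean(axis=1)

err = mu_hat - mu                                              # shape (reps, n)
mse_matrix = (err[:, :, None] * err[:, None, :]).mean(axis=0)  # E[(err)(err)^t]
scalar_mse = np.mean(np.sum(err ** 2, axis=1))                 # E[||err||^2]

print(np.trace(mse_matrix), scalar_mse)  # equal up to floating point
```

The equality is immediate because the trace of an outer product $vv^t$ is $\|v\|^2$, and trace commutes with the expectation.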
