Lehmann and Casella's *Theory of Point Estimation* (2nd edition) states the multivariate Central Limit Theorem without proof:

**Theorem 8.21 (Multivariate CLT).** Let $\mathbf{X}_\nu = (X_{1\nu}, \dots, X_{r\nu})$ be iid with mean vector $\zeta = (\zeta_1, \dots, \zeta_r)$ and covariance matrix $\Sigma = \lVert \sigma_{ij} \rVert$, and let $\overline{X}_{in} = (X_{i1} + \dots + X_{in})/n$. Then,

$$\left[ \sqrt{n}\,(\overline{X}_{1n} - \zeta_1), \dots, \sqrt{n}\,(\overline{X}_{rn} - \zeta_r) \right]$$

tends in law to the multivariate normal distribution with mean vector $mathbf{0}$ and covariance matrix $Sigma$.

What would be its derivation?


#### Best Answer

There are many proofs of the (many versions of the) CLT. In the iid case you mention, the usual proof is based on characteristic functions; there is a discussion in the thread *Proofs of the central limit theorem*.

The proof for the multivariate case is essentially the same as for the univariate case, mostly with changes in notation; essentially no new ideas are needed. The key observation: if the $\mathbf{X}_\nu$ are iid $r$-dimensional random vectors (such that the mean vector and covariance matrix exist), their distribution is characterized by the distributions of $a^T \mathbf{X}_\nu$ over all constant (non-random) vectors $a$ — this is the Cramér–Wold device. So we can apply the univariate CLT to $a^T \mathbf{X}_\nu$ for each $a$ and reconstruct the multivariate limit from those one-dimensional limits.
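A sketch of that reduction in the theorem's notation, writing $\overline{\mathbf{X}}_n$ for the vector of coordinate means $(\overline{X}_{1n}, \dots, \overline{X}_{rn})$:

```latex
% Cramér–Wold reduction of the multivariate CLT to the univariate CLT.
Fix a constant vector $a \in \mathbb{R}^r$ and set $Y_\nu = a^{T}\mathbf{X}_\nu$.
The $Y_\nu$ are iid univariate random variables with
\[
  \mathbb{E}[Y_\nu] = a^{T}\zeta,
  \qquad
  \operatorname{Var}(Y_\nu) = a^{T}\Sigma a .
\]
The univariate CLT therefore gives
\[
  \sqrt{n}\,\bigl(\overline{Y}_n - a^{T}\zeta\bigr)
  \;\xrightarrow{d}\; N\bigl(0,\, a^{T}\Sigma a\bigr).
\]
Since $\overline{Y}_n = a^{T}\overline{\mathbf{X}}_n$, this says that
$a^{T}\bigl[\sqrt{n}\,(\overline{\mathbf{X}}_n - \zeta)\bigr]$ converges in law
to $N(0,\, a^{T}\Sigma a)$, which is precisely the distribution of $a^{T}Z$ for
$Z \sim N_r(\mathbf{0}, \Sigma)$. As this holds for every fixed $a$, the
Cramér–Wold device yields
$\sqrt{n}\,(\overline{\mathbf{X}}_n - \zeta) \xrightarrow{d} N_r(\mathbf{0}, \Sigma)$.
```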
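As a numerical sanity check (not a proof), one can simulate iid non-Gaussian random vectors, rescale the sample mean as in the theorem, and verify that the empirical mean and covariance of $\sqrt{n}\,(\overline{\mathbf{X}}_n - \zeta)$ approach $\mathbf{0}$ and $\Sigma$. The particular distribution (centered exponentials mixed by a Cholesky factor) and all parameter values below are illustrative choices, not part of the original answer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: r = 2, with a known non-diagonal covariance matrix.
zeta = np.array([1.0, -2.0])            # mean vector ζ
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])          # covariance matrix Σ
L = np.linalg.cholesky(Sigma)           # Σ = L Lᵀ

n = 500       # sample size per replication
reps = 5000   # Monte Carlo replications

# Centered Exp(1) components (mean 0, variance 1), mixed by L to get
# covariance Σ, then shifted by ζ. The draws are deliberately non-Gaussian.
X = zeta + (rng.exponential(1.0, size=(reps, n, 2)) - 1.0) @ L.T

# Each row of `scaled` is one realization of sqrt(n) * (X̄_n - ζ).
scaled = np.sqrt(n) * (X.mean(axis=1) - zeta)

# By the multivariate CLT these should be near 0 and near Σ, respectively.
print(scaled.mean(axis=0))
print(np.cov(scaled.T))
```

The same setup can be used to check the Cramér–Wold step directly: for any fixed vector `a`, the projections `scaled @ a` should look approximately $N(0,\, a^{T}\Sigma a)$.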
