Solved – Convergence of eigenvectors and eigenvalues of a matrix that converges

For each realization $X = x$ of a random variable $X$, there is a symmetric positive definite matrix $M(x)$. Suppose we have a sample of random matrices $M_1, M_2, \dots, M_n$, where each $M_i$ is computed from the associated random sample $X = x_i$. Let $\hat{M}_n = \frac{1}{n}\sum\limits_{i=1}^n M_i$. Then we have the convergence result $\hat{M}_n \rightarrow E[M]$ at some rate depending on $n$, i.e. $\hat{M}_n - E[M] = O_p(\epsilon_n)$, where the convergence rate $\epsilon_n$ depends on the sample size $n$.

Suppose we have the eigendecompositions $\hat{M}_n = \hat{V}_n \hat{\Lambda}_n \hat{V}_n^{T}$ and $E[M] = V \Lambda V^{T}$. May I ask whether there is some convergence relationship between $\hat{V}_n$, $\hat{\Lambda}_n$ and $V$, $\Lambda$, respectively?
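To make the setup concrete, here is a small numerical sketch under an assumed toy model (the construction $M(x) = AA^T + I$ and the dimension are illustrative choices, not from the question; for this model $E[M] = (d+1)I$ in closed form, since $E[AA^T] = dI$ for a $d \times d$ standard normal $A$):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3

def sample_M(rng):
    # One random SPD matrix M(x): A A^T + I with A standard normal,
    # symmetric positive definite by construction.
    A = rng.standard_normal((d, d))
    return A @ A.T + np.eye(d)

# For this toy model E[M] is known exactly:
# E[A A^T] = d * I for a d x d standard normal A, so E[M] = (d + 1) * I.
EM = (d + 1) * np.eye(d)

errs = {}
for n in (100, 10_000):
    Mhat = np.mean([sample_M(rng) for _ in range(n)], axis=0)
    # Spectral-norm error of the sample mean; by the CLT this
    # shrinks roughly like 1 / sqrt(n).
    errs[n] = np.linalg.norm(Mhat - EM, 2)
    print(n, errs[n])
```

The error at $n = 10{,}000$ should be roughly ten times smaller than at $n = 100$, consistent with $\epsilon_n = n^{-1/2}$ in this i.i.d. setting.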

Many thanks!

Jan Magnus has made his book on matrix calculus available online. Section 8.7, page 177 onward, deals with differentials in eigenproblems. See whether you can apply the delta method and carry the big-$O$ and small-$o$ terms through those differentials. Without knowing your convergence rates, though, I can't say whether your question can be answered. He also discusses the pathological situations that arise with multiple eigenvalues, so if you have something converging to, say, $\lambda = 1$ at different rates, your eigenvalues may behave quite strangely.
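One piece of the answer holds regardless of multiplicities: for symmetric matrices, Weyl's inequality gives $|\lambda_i(\hat{M}_n) - \lambda_i(E[M])| \le \|\hat{M}_n - E[M]\|_2$ for every $i$ (eigenvalues sorted), so the sorted eigenvalues inherit the $O_p(\epsilon_n)$ rate; eigenvectors need an eigengap condition (e.g. the Davis–Kahan theorem). A minimal check of the Weyl bound on an arbitrary symmetric perturbation (the matrices here are random stand-ins, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5

# A symmetric "population" matrix and a small symmetric perturbation,
# standing in for E[M] and Mhat_n - E[M] respectively.
B = rng.standard_normal((d, d))
M = B @ B.T
P = rng.standard_normal((d, d))
P = 1e-3 * (P + P.T)
Mhat = M + P

lam = np.linalg.eigvalsh(M)         # sorted eigenvalues of M
lam_hat = np.linalg.eigvalsh(Mhat)  # sorted eigenvalues of M + P

gap = np.max(np.abs(lam_hat - lam))
bound = np.linalg.norm(P, 2)        # spectral norm of the perturbation
print(gap, bound)                   # Weyl: gap <= bound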

In the situation where all eigenvalues are 1, there exists a non-degenerate asymptotic distribution of the sample eigenvalues; see e.g. Johnstone (2001) or the original probability theory literature (the Marchenko–Pastur law) that he cites. You are probably familiar with that literature, and if it is relevant, this could be indicated in your question.
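The all-eigenvalues-equal-to-1 phenomenon is easy to simulate: take the sample covariance of i.i.d. $N(0, I_p)$ data, so every population eigenvalue is 1, and watch the sample eigenvalues spread over the Marchenko–Pastur support $[(1-\sqrt{\gamma})^2, (1+\sqrt{\gamma})^2]$ with $\gamma = p/n$ (the dimensions below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 2000, 500        # aspect ratio gamma = p / n = 0.25
gamma = p / n

X = rng.standard_normal((n, p))  # rows i.i.d. N(0, I_p): true covariance is I
S = X.T @ X / n                  # sample covariance; E[S] = I
ev = np.linalg.eigvalsh(S)

# Marchenko-Pastur support edges for identity population covariance:
lo, hi = (1 - np.sqrt(gamma)) ** 2, (1 + np.sqrt(gamma)) ** 2
print(ev.min(), ev.max(), (lo, hi))
```

Even though every population eigenvalue is 1, the sample eigenvalues spread across roughly $[0.25, 2.25]$ here; they concentrate at 1 only when $p/n \to 0$.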
