Let $\{X_i\}_{i\geq 1}$ be IID with finite second moment, and
$$
Y_n = \frac{2}{n(n+1)}\sum_{i=1}^n i\cdot X_i\,, \qquad n\geq 1\,.
$$
Could you please tell me how I can show that $Y_n$ converges in probability to $\mathrm{E}[X_1]$?
I was thinking of the Kolmogorov convergence criterion, but it seems I cannot prove it that way. Any suggestions?
Best Answer
Actually, we can even show that $\mathbb E|Y_n-\mathbb E[X_1]|^2\to 0$. Indeed, since $\sum_{j=1}^n j=n(n+1)/2$ and $\mathbb E[X_j]=\mathbb E[X_1]$ for all $j$, $$Y_n-\mathbb E[X_1]=\frac 2{n(n+1)}\sum_{j=1}^n j\,(X_j-\mathbb E[X_j]),$$ hence $$\tag{1}\mathbb E|Y_n-\mathbb E[X_1]|^2=\frac 4{n^2(n+1)^2}\sum_{i,j=1}^n ij\,\mathbb E\left[(X_i-\mathbb E[X_i])(X_j-\mathbb E[X_j])\right].$$ If $i\neq j$, then by independence $\mathbb E\left[(X_i-\mathbb E[X_i])(X_j-\mathbb E[X_j])\right]=0$, and plugging this into (1) gives $$\tag{2}\mathbb E|Y_n-\mathbb E[X_1]|^2=\frac 4{n^2(n+1)^2}\sum_{j=1}^n j^2\,\mathbb E\left[(X_j-\mathbb E[X_j])^2\right].$$ Using now the fact that $X_j$ has the same distribution as $X_1$ and bounding $\sum_{j=1}^n j^2=n(n+1)(2n+1)/6$ by $n^2(n+1)$, equality (2) becomes $$\mathbb E|Y_n-\mathbb E[X_1]|^2\leqslant\frac 4{n+1}\,\mathbb E\left[(X_1-\mathbb E[X_1])^2\right]\to 0,$$ and we are done: convergence in $L^2$ implies convergence in probability (via Chebyshev's inequality).
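For intuition, here is a minimal numerical sketch (my own addition, not part of the original answer) that estimates $\mathbb E|Y_n-\mathbb E[X_1]|^2$ by simulation and compares it with the $\frac{4}{n+1}\mathrm{Var}(X_1)$ bound derived above. The choice $X_i\sim N(0,1)$ and the helper name `weighted_mean_Y` are just illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def weighted_mean_Y(n, n_reps=20000):
    """Simulate Y_n = 2/(n(n+1)) * sum_{i=1}^n i*X_i for IID X_i ~ N(0,1)."""
    i = np.arange(1, n + 1)
    X = rng.standard_normal((n_reps, n))   # each row is one sample X_1, ..., X_n
    return (2.0 / (n * (n + 1))) * (X * i).sum(axis=1)

for n in (10, 100, 1000):
    Y = weighted_mean_Y(n)
    mse = np.mean(Y ** 2)                  # E[X_1] = 0 for this illustrative choice
    print(f"n={n:5d}  E|Y_n - E[X_1]|^2 ≈ {mse:.5f}   bound 4/(n+1) = {4/(n+1):.5f}")
```

The printed mean-squared errors shrink roughly like $1/n$ and stay below the bound, which is what the $L^2$ argument predicts.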