Proving the consistency of this OLS estimator for $\hat\beta_1$

So in this particular linear regression model we are given that $\beta_0=0$; that is, the model is $Y_i=\beta_1X_i+u_i$.

The goal is to find the estimator $\hat\beta_1$ and show that it is consistent.

I managed to find $\hat\beta_1$ as follows:



$\sum u_{i}^{2}=\sum (Y_i-\hat\beta_1 X_i)^2$

$\frac{\partial \sum u_i^2}{\partial\hat\beta_1}=\frac{\partial \sum (Y_i-\hat\beta_1 X_i)^2}{\partial\hat\beta_1} = 0$

$\sum (-2X_i)(Y_i-X_i\hat\beta_1)=0$

$\sum X_iY_i-\hat\beta_1\sum X_i^2=0$

$\frac{\sum Y_i}{\sum X_i}=\hat\beta_1$

To be consistent, I know that $\Pr\left(\left|\hat\beta_1-\beta_1\right| \leq c\right)=1$ must be the case.

Hence I get

$\Pr\left(\left|\frac{\sum Y_i}{\sum X_i} - \frac{Y_i}{X_i}\right| \leq c\right)=1$ must be the case.

This is where I am stuck. How do I go from here to show that the estimator is consistent?

Is this even the right approach to begin with? Thanks.

The condition for consistency is actually $$\Pr\left(\left|\frac{\sum_i Y_i}{\sum_i X_i}-\beta_1\right|\leq c\right)\to 1$$ for every $c>0$.

That is, it's $\beta_1$ that the ratio needs to be close to, and the probability needs to be 1 only asymptotically. Also, it's relatively unusual that you need to prove consistency directly in that way; usually you can show your statistic is made up of pieces that you know are consistent and argue that the whole thing is consistent.

Instead of sums in the fraction defining $\hat\beta_1$, think of means: $$\hat\beta_1 = \frac{\frac{1}{n}\sum_i Y_i}{\frac{1}{n}\sum_i X_i}$$ That's the same, since the $n$s cancel from the top and bottom, but we have the Law of Large Numbers for means.
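A quick numerical sketch of this point (not part of the original answer, with an illustrative choice of $\beta_1=2$, $X\sim U(1,3)$, and standard normal errors): the ratio of sums is identical to the ratio of means, and for large $n$ each mean sits close to its expectation.

```python
import numpy as np

rng = np.random.default_rng(0)
beta1 = 2.0  # assumed true slope for this illustration

n = 100_000
x = rng.uniform(1.0, 3.0, size=n)             # regressors with E[X] = 2
y = beta1 * x + rng.normal(0.0, 1.0, size=n)  # model: Y = beta1 * X + u

# The n's cancel: the ratio of sums equals the ratio of means.
ratio_of_sums = y.sum() / x.sum()
ratio_of_means = y.mean() / x.mean()
assert np.isclose(ratio_of_sums, ratio_of_means)

# By the LLN, each mean is near its expectation, so the ratio is near
# E[Y]/E[X] = beta1.
print(ratio_of_sums)  # close to 2.0
```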

As long as the distribution of $Y|X$ has a finite mean, the numerator converges in probability (or almost surely) to $E[Y]$.

What happens with the denominator depends on what you're assuming about $X$. If the $X_i$ are random (i.i.d. with a finite mean), then the law of large numbers applies, and the denominator converges in probability (or almost surely) to $E[X]$.

So, provided $E[X]\neq 0$, the ratio converges to $E[Y]/E[X]$. According to the model, $E[Y|X=x]=\beta_1 x$, so $E[Y]=E[E[Y|X]]=\beta_1E[X]$, hence $E[Y]/E[X]=\beta_1$ and we are done.
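The whole argument can also be checked by Monte Carlo (a sketch under the same illustrative assumptions as above, $\beta_1=2$, $X\sim U(1,3)$): the probability that the estimator misses $\beta_1$ by more than a fixed $c$ should shrink toward 0 as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
beta1 = 2.0  # assumed true slope for this illustration

def beta_hat(n):
    """One draw of the estimator sum(Y)/sum(X) from a sample of size n."""
    x = rng.uniform(1.0, 3.0, size=n)             # E[X] = 2, nonzero
    y = beta1 * x + rng.normal(0.0, 1.0, size=n)  # Y = beta1 * X + u
    return y.sum() / x.sum()

# Monte Carlo estimate of Pr(|beta_hat - beta1| > c) at increasing n;
# consistency says this probability goes to 0 for any fixed c > 0.
c = 0.05
exceed = {}
for n in (100, 1_000, 10_000):
    draws = np.array([beta_hat(n) for _ in range(2_000)])
    exceed[n] = np.mean(np.abs(draws - beta1) > c)
    print(n, exceed[n])
```

The printed exceedance probabilities fall as $n$ grows, which is exactly the asymptotic statement above.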
