I've worked the slope all the way down to $\sum[x_i(y_i - \bar{y})] = \hat\beta_1 \sum[x_i(x_i - \bar{x})]$

But I cannot figure out how to show the steps for:

$\sum[x_i(y_i - \bar{y})] = \sum(x_i - \bar{x})(y_i - \bar{y})$

and

$\hat\beta_1 \sum[x_i(x_i - \bar{x})] = \hat\beta_1 \sum(x_i - \bar{x})^2$


#### Best Answer

Add and subtract $\bar x$, then check that the unwanted term vanishes:

$$\sum[x_i(y_i-\bar y)]=\sum[(x_i-\bar x+\bar x)(y_i-\bar y)]=\sum[(x_i-\bar x)(y_i-\bar y)]+\sum[\bar x(y_i-\bar y)].$$

Now

$$\sum[\bar x(y_i-\bar y)]=\bar x\sum[y_i-\bar y]=\bar x\left[\sum y_i-n\bar y\right]=0,$$

where $n$ is the number of terms in the sum, so $\sum y_i = n\bar y$. Apply the same trick to the second equation.
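For completeness, here is a sketch of that same add-and-subtract step written out for the second sum:

```latex
\begin{align*}
\sum\bigl[x_i(x_i-\bar x)\bigr]
  &= \sum\bigl[(x_i-\bar x+\bar x)(x_i-\bar x)\bigr] \\
  &= \sum(x_i-\bar x)^2 + \bar x\sum(x_i-\bar x) \\
  &= \sum(x_i-\bar x)^2 + \bar x\left[\sum x_i - n\bar x\right]
   = \sum(x_i-\bar x)^2,
\end{align*}
```

since $\sum x_i = n\bar x$. Multiplying both sides by $\hat\beta_1$ gives $\hat\beta_1\sum[x_i(x_i-\bar x)] = \hat\beta_1\sum(x_i-\bar x)^2$, the second identity asked about.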
