Intercept increases in regression when adding explanatory variables

I am conducting an analysis in which I examine the size of the intercepts of three (time-series) regression models. The models look something like this:

$y_1=\alpha+\beta_1 x_1+\varepsilon$

$y_2=\alpha+\beta_1 x_1+\beta_2 x_2+\beta_3 x_3+\varepsilon$

$y_3=\alpha+\beta_1 x_1+\beta_2 x_2+\beta_3 x_3+\beta_4 x_4+\beta_5 x_5+\varepsilon$

When I run the regressions, I then examine the values of $\alpha$ and $t(\alpha)$ for all three models. I find that as I add factors (going from model 1 to model 3), the $R^2$ increases (as expected), so the added factors do seem to add some explanatory power. However, I also find that the $t(\alpha)$ values increase as I add factors.

Maybe I am missing something, but shouldn't $t(\alpha)$ decrease as I add factors, since more of the variation in the dependent variable is now explained by the factors rather than being absorbed by the intercept?

Edit: $t(\alpha)$ is the estimated $t$-statistic of the intercept $\alpha$ from the regression output.

It's not true that adding predictors should generally cause the estimate of the intercept $\alpha$, or its $t$-statistic, to decrease. Keep in mind that $t(\alpha)=\hat{\alpha}/\operatorname{se}(\hat{\alpha})$: even when $\hat{\alpha}$ barely moves, added predictors that soak up residual variance can shrink the intercept's standard error, which pushes $t(\alpha)$ up.

The intercept is the predicted value of $y$ when all the $x$ predictors are equal to 0. So adding new predictors can cause the intercept to increase or decrease, by pretty much any amount, depending on the mean of the $x$ predictor you're adding and the size and direction of its corresponding regression coefficient, as the sketch below illustrates.
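For concreteness, here is a minimal simulation sketch in Python (the data-generating process, sample size, and coefficients are invented for this example, not taken from the question) that mirrors the three-model setup and shows that $\hat{\alpha}$ and $t(\alpha)$ can move in either direction as predictors are added:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500

# Invented data-generating process, purely for illustration.
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(0.0, 1.0, n)   # mean 0: leaves the intercept roughly alone
x3 = rng.normal(3.0, 1.0, n)   # mean 3: omitting it inflates the intercept
y = 1.0 + 0.5 * x1 + 2.0 * x2 + 1.5 * x3 + rng.normal(0.0, 1.0, n)

designs = {
    "model 1 (x1)":         np.column_stack([x1]),
    "model 2 (x1, x2)":     np.column_stack([x1, x2]),
    "model 3 (x1, x2, x3)": np.column_stack([x1, x2, x3]),
}

for name, X in designs.items():
    m = sm.OLS(y, sm.add_constant(X)).fit()
    print(f"{name}: alpha = {m.params[0]:+6.3f}, "
          f"t(alpha) = {m.tvalues[0]:+8.2f}, R^2 = {m.rsquared:.3f}")
```

Going from model 1 to model 2, the intercept barely moves but $t(\alpha)$ rises, because $x_2$ soaks up residual variance and shrinks the intercept's standard error; going to model 3, the intercept drops toward its true value and $t(\alpha)$ falls with it. Neither quantity is forced to move in a fixed direction.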

This becomes especially clear when we note that we can write the estimate for the intercept as $$\hat{\alpha} = \bar{y} - \hat{\beta_1}\bar{X_1} - \hat{\beta_2}\bar{X_2} - \dots,$$ where $\bar{y}$ denotes the sample mean of $y$, $\hat{\beta_j}$ is the sample estimate of $\beta_j$, and $\bar{X_j}$ is the sample mean of $X_j$.
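A quick numerical check of this identity, in a self-contained sketch (again with invented data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = rng.normal([1.0, -2.0, 4.0], 1.0, size=(n, 3))  # three predictors with arbitrary means
y = 0.7 + X @ np.array([0.5, 2.0, 1.5]) + rng.normal(0.0, 1.0, n)

# Fit OLS by least squares on the design matrix [1, X].
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), X]), y, rcond=None)
alpha_hat, beta_hat = coef[0], coef[1:]

# The identity: alpha-hat equals ybar minus sum_j beta-hat_j * Xbar_j.
print(alpha_hat)
print(y.mean() - beta_hat @ X.mean(axis=0))  # agrees to machine precision
```

This also makes the mechanism plain: adding a predictor $X_j$ with a nonzero mean subtracts a new $\hat{\beta_j}\bar{X_j}$ term from the intercept, so the intercept shifts by that amount (plus whatever the other coefficients do in response).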
