How would you go about explaining "Stambaugh Bias" in simple relatively non-technical language?
I'm not sure you can explain this term without using some technical terms, unfortunately. I'll give it my best shot.
Some definitions first:
- Bias: the difference between the expectation of an estimator and the true value of the parameter you're estimating.
- OLS: Ordinary Least Squares; a method for solving a regression problem.
- Autoregressive process (AR): a time-series model in which the current value is a linear combination of the variable's own past values plus a random shock (see Wikipedia for the formal definition).
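To make the AR idea concrete, here is a minimal sketch of an AR(1) process in Python (the parameter values are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# AR(1): x_t = rho * x_{t-1} + eps_t, with rho = 0.9
rho = 0.9
T = 500
eps = rng.standard_normal(T)

x = np.empty(T)
x[0] = eps[0]
for t in range(1, T):
    x[t] = rho * x[t - 1] + eps[t]

# The sample lag-1 autocorrelation should come out close to rho
lag1_corr = np.corrcoef(x[:-1], x[1:])[0, 1]
print(lag1_corr)
```

With 500 observations the sample autocorrelation lands near 0.9, though, as discussed below, in short samples the OLS estimate of this coefficient is systematically biased downward.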
Stambaugh bias arises when you run a regression whose regressor is a lagged, persistent (autocorrelated) variable, for example regressing stock returns on last period's dividend yield. Two things combine to create the bias: in small samples, the OLS estimate of the regressor's autocorrelation coefficient is biased downward, and the shocks driving the regressor are correlated with the errors of the predictive regression. As a result, the bias in the OLS slope estimate is proportional to the bias in the estimated autocorrelation coefficient. Because the small-sample bias of the autocorrelation estimate has a known form, you can use it to correct the slope estimate.
The original paper (Stambaugh, 1999, "Predictive regressions", Journal of Financial Economics) really isn't too complicated, so long as you know both what an AR process is and how OLS regression works: Paper.