I am trying to test whether the intercepts of two linear regression models differ significantly.

I have return data of two portfolios (same time period and both monthly) and regressed each of them on the same independent variables (Fama/French factors for this time period). Now I have two different intercepts.

My question is, how do I test if these differ significantly?

This is probably fairly easy. I found this question, which sounds pretty much like what I want to do, but I am not sure whether a Chow test works in this situation. I also found several other comparable questions, but since different tests were recommended, I am not sure which one fits best.

I looked at this and this, but those are probably more involved than what is needed in my case.


#### Best Answer

Create a time series of the difference in returns between the two portfolios, that is, Ra − Rb for each time period. Now regress this as the dependent variable against the Fama-French factors as independent variables. The intercept in the regression results will be the difference in excess returns (alphas), and its t-statistic will indicate the significance of that difference. Incidentally, this intercept equals the difference between the intercepts you get from regressing each portfolio's return individually; however, the suggested method also lets you assess the significance of the difference.
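A minimal sketch of the procedure above, using NumPy only and synthetic data (the factor series, factor loadings, and alphas below are made-up stand-ins for real Fama-French data):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 120  # monthly observations

# Hypothetical factor series (stand-ins for Mkt-RF, SMB, HML)
F = rng.normal(size=(T, 3))
X = np.column_stack([np.ones(T), F])  # design matrix with intercept column

# Two portfolio excess-return series with different (made-up) true alphas
ra = 0.004 + F @ np.array([1.0, 0.2, -0.1]) + rng.normal(0, 0.02, T)
rb = 0.001 + F @ np.array([0.9, 0.3, 0.1]) + rng.normal(0, 0.02, T)

def ols(y, X):
    """OLS coefficients and their t-statistics (classical standard errors)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
    return beta, beta / se

# Separate regressions give one alpha per portfolio...
alpha_a = ols(ra, X)[0][0]
alpha_b = ols(rb, X)[0][0]

# ...while regressing the return difference gives the gap in alphas
# directly, together with a t-statistic for testing its significance.
beta_d, t_d = ols(ra - rb, X)
print(beta_d[0])              # equals alpha_a - alpha_b
print(t_d[0])                 # t-stat for the difference in alphas
```

With real data you would replace `F`, `ra`, and `rb` with the actual factor and portfolio series; a library such as statsmodels would give the same intercept with a full regression summary.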

### Similar Posts:

- Solved – Why is a regression model of portfolio return giving smaller adjusted R-square (i.e., negative) than expected
- Solved – How does Cornish-Fisher VaR (aka modified VaR) scale with time
- Solved – Difference between geometric and arithmetic mean
- Solved – Testing Sharpe Ratio significance