I have a VAR model that shows very low $R^2$ values (below 0.05). Does this mean that my model is very bad at explaining the data?


#### Best Answer

The low $R^2$ is a result of your choice to model differences in interest rates. However, $R^2$ is not particularly indicative of whether you have a good model. Estimating a VAR on interest rate levels rather than differences would give you a much higher $R^2$, but the higher $R^2$ is not by itself evidence of a better model (though the levels model is better here because it incorporates the mean-reversion and cointegration effects that your VAR in differences ignores). In economic and financial data it is common to see approximately AR(1) series with high $R^2$ values; after differencing, the $R^2$ drops sharply. Sometimes differencing is appropriate, but when there is mean reversion (or cointegration), failing to account for those effects can lead to bad forecasts. An error-correction model is a reparameterization of the VAR in levels: it produces the same forecasts, but because it is fit on differences, its $R^2$ is a more honest measure of fit than the one from the levels VAR.
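To see why differencing alone deflates $R^2$, here is a minimal simulation (not from the original answer; the persistence parameter 0.98 and sample size are illustrative choices). It fits a first-order autoregression by OLS to a near-unit-root AR(1) series in levels and then to its first differences:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a highly persistent AR(1): y_t = 0.98 * y_{t-1} + e_t
n = 2000
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.98 * y[t - 1] + rng.standard_normal()

def ar1_r2(x):
    """R^2 from an OLS regression of x_t on a constant and x_{t-1}."""
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    beta, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    resid = x[1:] - X @ beta
    return 1 - resid.var() / x[1:].var()

print(f"R^2 in levels:      {ar1_r2(y):.3f}")           # close to 1
print(f"R^2 in differences: {ar1_r2(np.diff(y)):.3f}")  # close to 0
```

The underlying process is the same in both regressions; only the transformation changes, yet the levels fit reports an $R^2$ near $\phi^2 \approx 0.96$ while the differenced fit reports one near zero.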

Also, rather than imposing a spread in the model, it is often better to let the statistical model do the hard work of determining the appropriate relationship. The cointegrating vectors from an ECM are better to use than typical spreads. Simply using the spreads is akin to imposing your own prior on the cointegrating vectors (while ignoring the rest of the Bayesian framework).
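As a sketch of the point above, the first step of an Engle-Granger procedure estimates the cointegrating relation by OLS instead of fixing it at a spread like $y_1 - y_2$. The simulated series and the true cointegrating vector $(1, -2)$ below are made-up illustrations, not anything from the question:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two cointegrated series sharing one random-walk ("common trend") component.
n = 1000
trend = np.cumsum(rng.standard_normal(n))
y1 = trend + rng.standard_normal(n)
y2 = 0.5 * trend + rng.standard_normal(n)  # true cointegrating vector: (1, -2)

# Engle-Granger step 1: estimate the cointegrating relation by OLS,
# rather than imposing a naive spread such as y1 - y2.
X = np.column_stack([np.ones(n), y2])
beta, *_ = np.linalg.lstsq(X, y1, rcond=None)
ect = y1 - X @ beta  # estimated error-correction term (roughly stationary)

print(f"estimated cointegrating coefficient: {beta[1]:.2f}")
```

Here the naive spread $y_1 - y_2$ is still trending, while the estimated coefficient recovers a value near the true 2, so the residual `ect` is the stationary quantity an ECM would feed back into the difference equations.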
