Is it possible to calculate AIC or BIC for nonlinear regression models like SVMs, regression trees, artificial neural networks, and others? AIC and BIC can be estimated for linear models, but I have not seen them computed for these nonlinear regression models. So I am wondering whether anyone can offer an opinion, with some examples. Thanks.


#### Best Answer

The AIC and BIC are both functions of the likelihood. Any model that is fit by maximum likelihood has a straightforward AIC and/or BIC. Some models that are fit with a penalized likelihood can also provide an AIC. For example, a generalized additive model provides an AIC by using effective degrees of freedom in place of the parameter count, together with the maximum of the penalized likelihood. I suppose this could be done with ridge regression as well.
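As a minimal sketch of the first point (simulated data, Gaussian linear model), once you have the maximized log-likelihood you can compute AIC and BIC directly; nothing here is specific to linear models beyond the likelihood itself:

```python
import numpy as np

# Hypothetical data from a simple Gaussian linear model.
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(scale=0.5, size=n)

# OLS coincides with the Gaussian MLE for the mean parameters.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2_mle = np.mean(resid**2)  # MLE of the error variance

# Maximized Gaussian log-likelihood.
loglik = -0.5 * n * (np.log(2 * np.pi * sigma2_mle) + 1)

k = X.shape[1] + 1  # intercept, slope, and error variance
aic = 2 * k - 2 * loglik
bic = k * np.log(n) - 2 * loglik
print(aic, bic)
```

The same two lines at the end apply to any model fit by maximum likelihood; only the log-likelihood and the parameter count change.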

One way to think of a regression tree is as a linear regression on dummy variables that indicate which partition each observation falls into. If you recast your tree that way, then there is nothing stopping you from fitting it with least squares and then computing AIC the normal way.
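Concretely, here is a toy sketch of that recasting with a hand-built one-split "tree" (the data and the split point at 0.5 are made up): each leaf becomes an indicator column, least squares on those columns recovers the per-leaf means, and the AIC follows as usual:

```python
import numpy as np

# Hypothetical data with a step at x = 0.5.
rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 1, size=n)
y = np.where(x < 0.5, 1.0, 3.0) + rng.normal(scale=0.3, size=n)

# Dummy-variable design matrix: one indicator column per leaf.
D = np.column_stack([(x < 0.5).astype(float), (x >= 0.5).astype(float)])

# Least squares on the dummies is exactly the per-leaf means.
mu, *_ = np.linalg.lstsq(D, y, rcond=None)
resid = y - D @ mu
sigma2 = np.mean(resid**2)

# Gaussian log-likelihood and AIC, "the normal way".
loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
k = D.shape[1] + 1  # one mean per leaf, plus the error variance
aic = 2 * k - 2 * loglik
print(aic)
```

A tree with more leaves just adds more indicator columns; the split locations being chosen from the data is what this parameter count quietly ignores.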

But why would you do this when packages like `rpart` automatically cross-validate for you? Cross-validation is almost always better than AIC for model selection. Other models could probably be shoehorned into a framework that makes AIC computation feasible, but the question will generally be: why?
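For comparison, a minimal sketch of the kind of cross-validation `rpart` automates, using a piecewise-constant fit on equal-width bins as a stand-in for a tree (the data, bin counts, and fold count are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=n)

def bin_predict(x_tr, y_tr, x_te, n_bins):
    # Piecewise-constant fit on equal-width bins: a stand-in for a tree.
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx_tr = np.clip(np.digitize(x_tr, edges) - 1, 0, n_bins - 1)
    idx_te = np.clip(np.digitize(x_te, edges) - 1, 0, n_bins - 1)
    means = np.array([y_tr[idx_tr == b].mean() if np.any(idx_tr == b)
                      else y_tr.mean() for b in range(n_bins)])
    return means[idx_te]

def cv_mse(n_bins, k=5):
    # k-fold cross-validation: average squared error on held-out folds.
    folds = np.array_split(np.arange(n), k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(np.arange(n), fold)
        pred = bin_predict(x[train], y[train], x[fold], n_bins)
        errs.append(np.mean((y[fold] - pred) ** 2))
    return float(np.mean(errs))

# Model selection by out-of-fold error rather than by AIC.
print(cv_mse(2), cv_mse(8))
```

The out-of-fold error directly estimates predictive performance without needing a likelihood or a parameter count, which is why it travels so easily to trees, SVMs, and neural networks.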

### Similar Posts:

- Solved – Explain ridge in the log-likelihood for Logistic Regression classifier
- Solved – How to choose between linear or nonlinear mixed model
- Solved – Why are confidence intervals and p-values not reported as default for penalized regression coefficients
- Solved – Selection of k knots in regression smoothing spline equivalent to k categorical variables