Solved – RMSE vs MAE when the dependent variable is between 0 and 1

I am comparing error estimates from different models, using MAE (mean absolute error) and RMSE (root mean squared error) as my error estimates.
The problem is that they often disagree with each other.

  • Can there be any bias in the estimate, such that one of the error estimates will
    favour one type of model?

  • RMSE gives higher weight to large errors because it squares them before
    averaging. Does this still work well when the dependent variable lies in [0, 1]?

(There is a similar discussion on the netflix prize forum here.)

  • I am a little confused here: are there specific circumstances under which I
    should choose one particular error estimate to identify the best model?
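To see how the two metrics can disagree, here is a minimal sketch (the residual values for the two hypothetical models are made up for illustration): a model with consistently small errors and a model with mostly zero errors but one large miss can be ranked differently by MAE and RMSE.

```python
import math

def mae(errors):
    """Mean absolute error of a list of residuals."""
    return sum(abs(e) for e in errors) / len(errors)

def rmse(errors):
    """Root mean squared error of a list of residuals."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical residuals from two models on a [0, 1]-valued target.
model_a = [0.2, 0.2, 0.2, 0.2, 0.2]   # consistently small errors
model_b = [0.0, 0.0, 0.0, 0.0, 0.5]   # mostly perfect, one large error

print(mae(model_a), rmse(model_a))  # 0.2, 0.2
print(mae(model_b), rmse(model_b))  # 0.1, ~0.224
```

By MAE, model B is better; by RMSE, model A is better, because squaring amplifies B's single large error. The disagreement is purely about how much weight large errors should get, not about the [0, 1] range itself.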

Least squares is optimal when the error distribution is Gaussian. But when the error distribution is heavy-tailed or there are outliers, it is a poor criterion because it gives too much weight to the observations with large errors. So when robust estimation is more appropriate, MAE is a better criterion than RMSE. This applies regardless of whether the dependent variable is constrained or unconstrained.
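The outlier sensitivity can be sketched with the simplest possible case: fitting a constant. The least-squares fit to a sample is its mean, while the least-absolute-error fit is its median (the data below are made up for illustration):

```python
import statistics

# Four typical values in [0, 1] plus one outlier.
data = [0.10, 0.12, 0.11, 0.13, 0.95]

# The mean minimizes squared error and is dragged toward the outlier.
print(statistics.mean(data))    # 0.282

# The median minimizes absolute error and stays with the bulk of the data.
print(statistics.median(data))  # 0.12
```

The same effect carries over to model comparison: a model judged by RMSE is penalized heavily for the outlier, while one judged by MAE is not.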
