Advantage of MLE over LS or other estimators

Is there any advantage of a maximum-likelihood estimator over another estimator (for example least squares) assuming both methods of estimation are efficient, unbiased, and consistent?

I have been taught that maximum-likelihood methods are generally superior but I don't know why.

The MLE is more efficient when the distributional assumptions are correctly specified. For instance, if the linear model is $Y = bX + \epsilon$, where $\epsilon$ follows, say, a shifted, zero-mean Weibull distribution, the MLE will account for the skewness of the errors while least squares will not. It happens to be the case that when $\epsilon$ is normally distributed, the MLE coincides with the least squares estimator.
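A small Monte Carlo sketch can illustrate the efficiency gap. Instead of the Weibull example, this sketch (an assumption for illustration, not from the original post) uses zero-mean Laplace errors, because the MLE under Laplace errors has a simple form: minimizing the sum of absolute residuals (least absolute deviations). Asymptotic theory says the LAD slope estimator should have roughly half the variance of the OLS slope estimator under Laplace errors:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n, b_true, n_sims = 200, 2.0, 500
x = rng.uniform(1.0, 3.0, size=n)  # fixed design across simulations

ols_est, mle_est = [], []
for _ in range(n_sims):
    # Zero-mean Laplace (double-exponential) errors, scale 1
    eps = rng.laplace(0.0, 1.0, size=n)
    y = b_true * x + eps

    # Least squares slope (no intercept): closed form x'y / x'x
    ols_est.append(x @ y / (x @ x))

    # MLE under Laplace errors = least absolute deviations:
    # minimize sum |y_i - b * x_i| over b
    res = minimize_scalar(lambda b: np.abs(y - b * x).sum(),
                          bounds=(0.0, 4.0), method="bounded")
    mle_est.append(res.x)

var_ols, var_mle = np.var(ols_est), np.var(mle_est)
print(f"OLS slope variance: {var_ols:.5f}")
print(f"MLE (LAD) slope variance: {var_mle:.5f}")
```

Both estimators are unbiased and consistent here, yet across repeated samples the MLE's variance is noticeably smaller, which is the efficiency advantage the answer describes. With normal errors the two estimators would coincide and the gap would vanish.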
