Is there any advantage to a maximum-likelihood estimator over another estimator (for example, least squares), assuming both methods of estimation are efficient, unbiased, and consistent?
I have been taught that maximum-likelihood methods are generally superior, but I don't know why.
Best Answer
The MLE is more efficient when the distributional assumptions are correctly specified. For instance, if the linear model is $Y = bX + \epsilon$, where $\epsilon$ comes from, say, a Weibull distribution shifted to have mean zero, the MLE will account for the skewness of the errors while least squares will not. It also happens that when $\epsilon$ is normally distributed, maximizing the likelihood yields the least squares estimator anyway.
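Here is a minimal simulation sketch of that point in Python (the language choice, the Weibull shape parameter of 2, the sample sizes, and all names are my own assumptions, not part of the original answer). It draws skewed, zero-mean errors from a shifted Weibull, fits the slope both by least squares and by maximizing the correctly specified likelihood, and compares the sampling variance of the two estimators:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

# Model: Y = b*X + eps, with eps a Weibull(shape=2) variate shifted by its
# mean so that E[eps] = 0 (a skewed error distribution with mean zero).
b_true, n_sim, n = 2.0, 2000, 100
shape = 2.0
shift = stats.weibull_min.mean(shape)  # Weibull(2) mean, about 0.886

def neg_loglik(b, x, y):
    """Negative log-likelihood of the slope b under the Weibull error model."""
    e = y - b[0] * x + shift           # map residuals back to the Weibull support
    if np.any(e <= 0):
        return 1e12                    # large penalty outside the support
    return -stats.weibull_min.logpdf(e, shape).sum()

b_ls, b_ml = [], []
for _ in range(n_sim):
    x = rng.uniform(1.0, 3.0, n)
    eps = stats.weibull_min.rvs(shape, size=n, random_state=rng) - shift
    y = b_true * x + eps
    # Least squares slope for a no-intercept model, matching Y = bX + eps.
    b0 = (x @ y) / (x @ x)
    b_ls.append(b0)
    # MLE under the correctly specified error distribution, started at LS.
    res = optimize.minimize(neg_loglik, x0=[b0], args=(x, y),
                            method="Nelder-Mead")
    b_ml.append(res.x[0])

print("LS  slope variance:", np.var(b_ls))
print("MLE slope variance:", np.var(b_ml))  # typically noticeably smaller
```

Swapping the Weibull error for a normal one makes the two variances coincide, matching the closing remark above.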
Similar Posts:
- Solved – Does efficiency imply unbiased and consistency
- Solved – How to find the OLS estimator of variance of error
- Solved – When would maximum likelihood estimates equal least squares estimates