I have a question regarding the Maximum Likelihood Estimate in a linear regression model without an intercept. I have the model:
$$Y_i=\beta X_i +\epsilon_i,\quad i=1,\dots,n$$
where the $\epsilon_i$ are i.i.d. $N(0,\sigma^2)$.
I concluded that the log-likelihood function looks like this:
$$\ell(\beta,\sigma)=\sum_{i=1}^n \left( \ln\frac{1}{\sqrt{2\pi}} -\ln\sigma -\frac{(y_i-\beta x_i)^2}{2\sigma^2} \right)$$
The easy part of this question is finding the MLE of $\sigma$ and the MLE of $\beta$.
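For reference, if I'm not mistaken, setting the partial derivatives of $\ell$ with respect to $\beta$ and $\sigma$ to zero gives the standard no-intercept results:

$$\hat\beta=\frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n x_i^2},\qquad \hat\sigma^2=\frac{1}{n}\sum_{i=1}^n \left(y_i-\hat\beta x_i\right)^2.$$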
But what I really don't know how to evaluate is the MLE of $\frac{\beta}{\sigma}$.
What crossed my mind is to simply set the MLE of $\frac{\beta}{\sigma}$ equal to $\frac{\beta_{MLE}}{\sigma_{MLE}}$.
But I believe that this is not the right solution.
Any hints?
Best Answer
A maximum likelihood estimator has the nice property that it is invariant under transformations. This means that if $\theta_{MLE}$ is the MLE for $\theta$, then for a function $g(\theta)$, $g(\theta_{MLE})$ is the MLE for $g(\theta)$.
This can be directly applied to your problem. Hint: what is the MLE for $(\beta,\sigma)$?
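If it helps to convince yourself, here is a quick numerical sanity check of the invariance property (a sketch assuming NumPy and SciPy are available; the simulated data, true parameter values, and optimizer starting point are all arbitrary illustrations). It compares the ratio of the closed-form MLEs against a direct numerical maximization of the same log-likelihood reparametrized in terms of $\gamma=\beta/\sigma$:

```python
import numpy as np
from scipy.optimize import minimize

# Simulate data from the no-intercept model (illustrative values)
rng = np.random.default_rng(0)
n = 200
beta_true, sigma_true = 2.0, 1.5
x = rng.normal(size=n)
y = beta_true * x + rng.normal(scale=sigma_true, size=n)

# Closed-form MLEs for the no-intercept model (note the 1/n in sigma^2)
beta_hat = np.sum(x * y) / np.sum(x * x)
sigma_hat = np.sqrt(np.mean((y - beta_hat * x) ** 2))

# Reparametrize with gamma = beta / sigma and maximize the same
# log-likelihood over (gamma, sigma); constants are dropped since
# they do not affect the maximizer.
def neg_loglik(params):
    gamma, sigma = params
    if sigma <= 0:  # guard against the optimizer stepping out of range
        return np.inf
    beta = gamma * sigma
    return n * np.log(sigma) + np.sum((y - beta * x) ** 2) / (2 * sigma**2)

res = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
gamma_mle = res.x[0]

# By invariance, the two should agree up to optimizer tolerance
print(beta_hat / sigma_hat, gamma_mle)
```

The `gamma` component of the maximizer matches $\hat\beta/\hat\sigma$, which is exactly what the invariance property predicts.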