Suppose $X_1, X_2, \ldots, X_n$ is a random sample from a $\text{Poisson}(\theta)$ distribution with probability mass function:

$$P(X=x)=\frac{\theta^{x}e^{-\theta}}{x!}, \quad x=0,1,2,\ldots; \quad \theta>0$$

What is the maximum likelihood estimator for $e^{-\theta} = P(X = 0)$?

I already found the MLE for $\theta$. How do you then find the MLE of $P(X = 0)$, which is equal to $e^{-\theta}$?


#### Best Answer

Invariance principle: the maximum likelihood estimator of a transform of a parameter is the transform of the maximum likelihood estimator. Since the MLE of $\theta$ for a Poisson sample is $\hat{\theta} = \bar{X}$, the MLE of $e^{-\theta}$ is simply $e^{-\bar{X}}$.
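A minimal numerical sketch of this, using a made-up sample (the counts below are purely illustrative):

```python
import math

# Hypothetical Poisson sample, invented for illustration
sample = [0, 2, 1, 0, 3, 1, 0, 2, 1, 0]

# The MLE of theta for a Poisson sample is the sample mean
theta_hat = sum(sample) / len(sample)

# By invariance, the MLE of P(X = 0) = e^{-theta} is e^{-theta_hat}
p0_hat = math.exp(-theta_hat)

print(theta_hat)  # 1.0 for this sample
print(p0_hat)     # e^{-1} ≈ 0.3679
```

Note that $e^{-\bar{X}}$ need not equal the observed fraction of zeros in the sample; the invariance principle says it is the MLE of $P(X=0)$ under the Poisson model, not the empirical frequency.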
