Suppose we have a sample $X = [x_1, x_2, \ldots, x_m]$ of independent draws from a $\text{Binomial}(n,p)$ distribution, with known $n$ and unknown $p$.

Now, assume we want to estimate $p$. Textbooks and articles online usually state that the MLE of $p$ is $\frac{\sum_{i=1}^{m}x_i}{n}$. However, isn't that correct only when $m=1$, or, in other words, when we end up with merely a single $\text{Bernoulli}$-type observation?

If so, wouldn't it be more precise to say that the MLE of $p$ is actually $\frac{\sum_{i=1}^{m}x_i}{mn}$?


#### Best Answer

You are right: although even credible sources sometimes claim otherwise, the correct formula is

$$ \hat p = \frac{1}{mn} \sum_{i=1}^m x_i = \overbrace{\frac{1}{m} \sum_{i=1}^m}^\text{we have m trials} \overbrace{\frac{x_i}{n}}^{\substack{\text{proportion of successes} \\ \text{in single Bernoulli trial}}} $$
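For completeness, here is a short sketch of the standard derivation (not spelled out in the original answer). The likelihood of the sample is

$$ L(p) = \prod_{i=1}^m \binom{n}{x_i} p^{x_i} (1-p)^{n - x_i}, $$

so the log-likelihood, up to a constant not involving $p$, is

$$ \ell(p) = \left(\sum_{i=1}^m x_i\right) \log p + \left(mn - \sum_{i=1}^m x_i\right) \log(1-p). $$

Setting $\ell'(p) = 0$ gives $\frac{\sum_i x_i}{p} = \frac{mn - \sum_i x_i}{1-p}$, which solves to $\hat p = \frac{1}{mn}\sum_{i=1}^m x_i$.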

since if you calculated the ordinary arithmetic mean of $X$, you would get the average number of successes in $n$ trials, which estimates $E(X) = np$ rather than $p$ itself; dividing by $n$ recovers the success probability.
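A quick numerical sanity check of this (a sketch, not from the original answer; the sample size $m$, trial count $n$, and true $p$ below are arbitrary choices): simulate binomial data, compute the closed-form estimate $\sum_i x_i / (mn)$, and confirm a brute-force grid search over the log-likelihood lands on the same value.

```python
import math
import random

random.seed(0)
n, m, p_true = 10, 500, 0.3  # arbitrary illustration values

# Draw m Binomial(n, p_true) observations as sums of Bernoulli trials.
xs = [sum(random.random() < p_true for _ in range(n)) for _ in range(m)]

def log_lik(p):
    """Binomial log-likelihood, dropping the constant binomial coefficients."""
    s = sum(xs)
    return s * math.log(p) + (m * n - s) * math.log(1 - p)

# Closed-form MLE from the formula above: divide by m*n, not by n.
p_hat = sum(xs) / (m * n)

# Grid search over (0, 1) should find (approximately) the same maximizer.
grid = [k / 10000 for k in range(1, 10000)]
p_grid = max(grid, key=log_lik)

print(p_hat, p_grid)
```

The two values agree up to the grid resolution, and both sit near the true $p$; using $\sum_i x_i / n$ instead would overshoot by roughly a factor of $m$.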
