# Solved – Maximum likelihood estimator of $\theta$ for a uniform distribution

I know that for uniformly distributed random variables $$X_1, X_2, \dots, X_n \in \mathbb{R}$$, the p.d.f. is given by:

$$f(x_i) = \frac{1}{\theta} \quad \text{if } 0 \le x_i \le \theta$$

$$f(x_i) = 0 \quad \text{otherwise}$$

If the uniformly distributed random variables are arranged in the following order

$$0 \le X_1 \le X_2 \le X_3 \le \dots \le X_n \le \theta,$$

I understand that the likelihood function is given by

$$L(\theta) = \prod_{i=1}^{n} f(x_i) = \theta^{-n}$$

The log-likelihood is:

$$\ln L(\theta) = -n \ln(\theta)$$

Differentiating with respect to the parameter $$\theta$$, we get:

$$\frac{\mathrm d}{\mathrm d\theta} \ln L(\theta) = -\frac{n}{\theta}$$

which is $$< 0$$ for $$\theta > 0$$.

Hence, $$L(\theta)$$ is a decreasing function and it is maximized at $$\theta = X_{(n)}$$.

The maximum likelihood estimate is thus

$$\hat{\theta} = X_{(n)}$$
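The derivation above can be checked numerically: evaluate the likelihood on a grid of candidate values of $\theta$ and confirm that the argmax lands at the sample maximum. This is a minimal sketch using NumPy; the true parameter value, sample size, and grid are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 5.0
x = rng.uniform(0, theta_true, size=20)  # i.i.d. Uniform(0, theta_true) sample

def likelihood(theta, x):
    # L(theta) = theta^{-n} if all observations lie in [0, theta], else 0
    # (the density is zero whenever any observation exceeds theta)
    if theta < x.max():
        return 0.0
    return theta ** (-len(x))

# Grid search over candidate values of theta
grid = np.linspace(0.01, 10.0, 100_000)
L = np.array([likelihood(t, x) for t in grid])
theta_hat = grid[L.argmax()]

print(theta_hat, x.max())  # theta_hat is (up to grid resolution) the sample maximum
```

The grid search makes the constraint explicit: every candidate below $x_{(n)}$ has likelihood exactly zero, so the maximizer is the smallest admissible value, $x_{(n)}$.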

My question is: what if I find the supremum to solve this instead?


The result is correct, but the reasoning is somewhat inaccurate. You need to keep track of the fact that the density is zero outside $$[0, \theta]$$. This implies that the likelihood is zero for every $$\theta$$ to the left of the sample maximum $$X_{(n)}$$, and jumps up to $$X_{(n)}^{-n}$$ at the maximum. It indeed decreases afterwards, so the sample maximum is the MLE.
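The shape described above (zero to the left of $X_{(n)}$, a jump at $X_{(n)}$, then strictly decreasing) can be verified directly. A small sketch, with an arbitrary simulated sample:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 3.0, size=10)  # i.i.d. Uniform(0, 3) sample
m = x.max()                        # the sample maximum X_(n)
n = len(x)

def L(theta):
    # Likelihood is zero unless every observation lies in [0, theta]
    return theta ** (-n) if theta >= m else 0.0

assert L(m - 1e-9) == 0.0              # zero just left of the sample maximum
assert L(m) == m ** (-n)               # jumps up to m^{-n} at theta = m
assert L(m) > L(m + 0.5) > L(m + 1.0)  # strictly decreasing afterwards
```

Because the function jumps from zero to a positive value and then decreases, its supremum over $\theta > 0$ is attained at $\theta = X_{(n)}$, which is why the supremum argument gives the same answer as the monotonicity argument.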