Solved – Proving that Shannon entropy is maximised for the uniform distribution

I know that Shannon entropy is defined as $-\sum_{i=1}^k p_i\log(p_i)$. For the uniform distribution, $p_i=\frac{1}{k}$, so this becomes $-\sum_{i=1}^k\frac{1}{k}\log\left(\frac{1}{k}\right)$.
Further rearrangement produces the following:

$-\sum_{i=1}^k\frac{1}{k}\log\left(k^{-1}\right)$

$\sum_{i=1}^k\frac{1}{k}\log(k)$

This is where I am stuck. I need the solution to come to $\log(k)$. What is the next step?
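(As a concrete check, for a fair coin, $k = 2$, the answer should be $-2\cdot\frac{1}{2}\log\frac{1}{2} = \log 2$.)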

Using Lagrange multipliers, we form the Lagrangian:

$$\mathcal{L} = \left\{ -\sum_i^k p_i \log p_i - \lambda\left( \sum_i^k p_i - 1 \right)\right\}$$

Setting the derivative with respect to each probability $p_i$ to zero:

$$\frac{\partial \mathcal{L}}{\partial p_i} = 0 = -\log p_i - 1 - \lambda \implies$$

$$p_i = e^{-(1+\lambda)}\tag{1}$$
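Here $\log$ is the natural logarithm, so by the product rule each term of the entropy differentiates as

$$\frac{\partial}{\partial p_i}\left(-p_i\log p_i\right) = -\log p_i - p_i\cdot\frac{1}{p_i} = -\log p_i - 1,$$

which is where the $-1$ in the derivative above comes from.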

Setting the derivative with respect to $\lambda$ to zero:

$$\frac{\partial \mathcal{L}}{\partial \lambda} = 0 = -\sum_i^k p_i + 1 \implies$$

$$\sum_i^k p_i = 1 \tag{2}$$

Substituting equation (1) into equation (2):

$$\sum_i^k e^{-(1+\lambda)} = 1 \implies$$

$$k e^{-(1+\lambda)} = 1$$

Since every $p_i$ equals the same constant $e^{-(1+\lambda)}$,

$$p_i = \frac{1}{k}$$
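Strictly, setting the derivatives to zero only identifies a stationary point. It is a maximum (not a minimum) because each term of the objective is strictly concave,

$$\frac{\partial^2}{\partial p_i^2}\left(-p_i\log p_i\right) = -\frac{1}{p_i} < 0,$$

and the constraint is linear, so this unique stationary point on the simplex is the global maximum.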

The Shannon entropy formula now becomes

$$H = -\sum_i^k \frac{1}{k}\log \frac{1}{k}$$

Since $-\log\frac{1}{k} = \log k$ and the summand no longer depends on the index $i$, the sum is just $k$ identical terms:

$$H = \frac{k}{k}\log k = \log k$$
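As a quick numerical sanity check (a minimal NumPy sketch, not part of the derivation above; it uses natural logarithms, so entropy is measured in nats):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats: H(p) = -sum_i p_i * log(p_i), treating 0*log(0) as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

k = 5
uniform = np.full(k, 1.0 / k)
print(shannon_entropy(uniform), np.log(k))  # both equal log(k) ~ 1.6094

# No other distribution on k outcomes exceeds log(k).
rng = np.random.default_rng(0)
for _ in range(1000):
    q = rng.dirichlet(np.ones(k))  # a random point on the probability simplex
    assert shannon_entropy(q) <= np.log(k) + 1e-12
```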
