Solved – Entropy of a Random Variable

So I know how to calculate entropy given a specific probability:

E.g. $H(0.5, 0.5) = 1$ bit

However, I'm a bit unsure how to handle the random variable case. For instance:

Let $X, Y$ be jointly distributed random variables. What are the relationships ($\le$, $<$, $=$, $>$, $\ge$) between the following:

  1. $H(X) + H(Y)$ and $H(X + Y)$

  2. $H(\cos(Y))$ and $H(Y)$

I'm not really sure where to start, since we're no longer considering entropies of specific probabilities…

Any help is much appreciated!

It's essentially the same. Every random variable (r.v.) has a distribution, i.e. a PMF in the discrete case or a pdf in the continuous case. The entropy of the r.v. is simply the entropy of that PMF or pdf.
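For concreteness, here is a minimal Python sketch of that idea (the `entropy` helper is just my own name, not standard notation): the entropy of a discrete r.v. is computed directly from its PMF, and $H(0.5, 0.5)$ comes out to 1 bit as in your example.

```python
# Minimal sketch: the entropy of a discrete r.v. is the entropy of its PMF.
import numpy as np

def entropy(pmf):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(list(pmf), dtype=float)
    p = p[p > 0]                       # treat 0 * log(0) as 0
    return float(-np.sum(p * np.log2(p)))

print(entropy([0.5, 0.5]))             # 1.0 bit
print(entropy([0.25, 0.25, 0.5]))      # 1.5 bits
```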

To determine the ordering of the entropies listed in the question, however, it helps to know some basic properties of the entropy and mutual information of random variables. There are many good textbooks on information theory that cover these, e.g. Cover & Thomas's or Raymond W. Yeung's.

Using these properties, your questions can be easily answered as follows. For discrete r.v.s, we have

  1. $H(X+Y) \le H(X) + H(Y)$
  2. $H(\cos(Y)) \le H(Y)$

Proof: By the chain rule, we have $$H(X+Y, X) = H(X) + H(X+Y \mid X) = H(X) + H(Y \mid X) \le H(X) + H(Y),$$ where the second equality holds because, given $X$, knowing $X+Y$ is the same as knowing $Y$, and the inequality holds because conditioning cannot increase entropy. But we also have $$H(X+Y, X) = H(X+Y) + H(X \mid X+Y) \ge H(X+Y),$$ since the conditional entropy of discrete r.v.s is nonnegative. Combining the two gives the 1st inequality, $H(X+Y) \le H(X) + H(Y)$.
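If it helps, here is a quick numerical sanity check of the first inequality; the joint PMF below is an arbitrary dependent example I made up, not anything from your question.

```python
# Check H(X+Y) <= H(X) + H(Y) on a small (dependent) joint PMF.
import numpy as np
from collections import defaultdict

def entropy(pmf):
    p = np.asarray(list(pmf), dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}   # p(x, y)

px, py, psum = defaultdict(float), defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p                  # marginal PMF of X
    py[y] += p                  # marginal PMF of Y
    psum[x + y] += p            # PMF of X + Y

print(entropy(psum.values()))                        # H(X+Y)    ~ 1.52
print(entropy(px.values()) + entropy(py.values()))   # H(X)+H(Y) = 2.0
```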

The proof for the 2nd inequality is similar. $$H(\cos(Y), Y) = H(Y) + H(\cos(Y) \mid Y) = H(Y),$$ since given $Y$, $\cos(Y)$ is a constant. On the other hand, we also have $$H(\cos(Y), Y) = H(\cos(Y)) + H(Y \mid \cos(Y)).$$ So it's clear that $H(\cos(Y)) \le H(Y)$.
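And a similar check for the second inequality. The support of $Y$ is my own choice (uniform on $\{0, \pi, 2\pi\}$), picked so that $\cos$ maps two outcomes to the same value and the inequality is strict.

```python
# Check H(cos(Y)) <= H(Y) with Y uniform on {0, pi, 2*pi},
# where cos(0) = cos(2*pi) = 1 collapses two outcomes into one.
import numpy as np
from collections import defaultdict

def entropy(pmf):
    p = np.asarray(list(pmf), dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

py = {0.0: 1/3, np.pi: 1/3, 2 * np.pi: 1/3}        # PMF of Y

pcos = defaultdict(float)
for y, p in py.items():
    pcos[round(np.cos(y), 12)] += p                # PMF of cos(Y)

print(entropy(py.values()))      # H(Y)      = log2(3)     ~ 1.585
print(entropy(pcos.values()))    # H(cos(Y)) = H(1/3, 2/3) ~ 0.918
```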

Note, however, that the above inequalities need not hold for continuous r.v.s, since differential entropy (and hence conditional differential entropy) can be negative.
