# Solved – Deriving Posterior Binomial Density from Uniform Prior

I'm trying to derive the posterior density of the probability parameter of a binomial random variable, given one realization of the random variable and a uniform prior density on the probability parameter.

``X ~ Binomial(m, p),  p ~ Uniform[0, 1]``

I know that to derive the posterior density, f(p|x), I want to divide the joint density f(x|p)π(p) by the marginal of X. Since the prior is uniform, I have that the joint density f(x|p)π(p) is just f(x|p), the binomial density.

However, I'm a bit at a loss for the marginal density of X. I think that I obtain it by integrating f(x|p) over all p in [0,1], producing 1/(m+1).
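As a quick numerical sanity check (a sketch in Python with an arbitrary `m = 10`, assuming `scipy` is available), integrating the binomial density over p does give 1/(m+1) for every value of x:

```python
from math import comb
from scipy.integrate import quad

m = 10  # arbitrary choice for illustration
for x in range(m + 1):
    # Integrate f(x|p) = C(m,x) p^x (1-p)^(m-x) over p in [0, 1]
    marginal, _ = quad(lambda p: comb(m, x) * p**x * (1 - p)**(m - x), 0, 1)
    assert abs(marginal - 1 / (m + 1)) < 1e-9  # marginal of X is uniform on {0,...,m}
```

Notably, the marginal doesn't depend on x at all: under a uniform prior, every count from 0 to m is equally likely a priori.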

If I do this, I get that the posterior density is

``(m+1) * choose(m,x) * p^x * (1-p)^(m-x)``,

which doesn't seem right. Am I doing anything wrong here?


A simpler approach is to ignore all the terms not involving $p$ until the end, then either:

1. recognize what the distribution is, enabling you to write down the constant of integration from memory or another source;
2. perform the necessary integration to derive the constant of integration; or
3. give up trying to find an analytic form for the constant of integration and resort to simulation-based methods, e.g., MCMC, instead.

In this case, we'd start out with $f(x|p) \propto p^x(1-p)^{m-x}$, ignoring the rest of $f$ as it's not a function of $p$. Your prior is, as you have stated, just a constant, so it too can be ignored. This means that your posterior $f(p|x) \propto p^x(1-p)^{m-x}$, which can be recognized as the kernel of a $\text{Beta}(x+1, m-x+1)$ distribution. The constant of integration is just $1/\text{B}(x+1, m-x+1)$.
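To see that the two forms agree, here is a sketch (assuming `scipy` and `numpy`, with arbitrary values m = 10, x = 3) comparing the posterior written with your (m+1)·C(m,x) constant against the Beta(x+1, m-x+1) density:

```python
import numpy as np
from math import comb
from scipy.stats import beta

m, x = 10, 3  # arbitrary illustration values
p = np.linspace(0.01, 0.99, 50)

# Posterior as derived in the question: f(x|p) * π(p) / f(x), with f(x) = 1/(m+1)
posterior = (m + 1) * comb(m, x) * p**x * (1 - p)**(m - x)

# It matches the Beta(x+1, m-x+1) density pointwise
assert np.allclose(posterior, beta.pdf(p, x + 1, m - x + 1))
```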

To compare to your result, we expand the Beta function, convert its constituent Gamma functions to factorials, and rewrite:

$${1 \over \text{B}(x+1,m-x+1)} = {\Gamma(m+2) \over \Gamma(x+1)\,\Gamma(m-x+1)} = {(m+1)! \over x!\,(m-x)!} = (m+1)\binom{m}{x}$$

as you derived! So your posterior was correct all along; recognizing the Beta form just makes that evident.
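The identity above is easy to check numerically (again with the arbitrary m = 10, x = 3, using only the standard library):

```python
from math import comb, gamma

m, x = 10, 3  # arbitrary illustration values

# B(a, b) = Γ(a)Γ(b) / Γ(a+b), here with a = x+1, b = m-x+1
B = gamma(x + 1) * gamma(m - x + 1) / gamma(m + 2)

# 1/B(x+1, m-x+1) equals (m+1) * C(m, x)
assert abs(1 / B - (m + 1) * comb(m, x)) < 1e-6
```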
