# Solved – How to compute the PDF of a sum of Bernoulli and normal variables analytically

Can convolution be applied to get a closed form expression for \$Z = X + N\$ where \$X\$ is a Bernoulli random variable and \$N\$ is a zero mean normal random variable independent of \$X\$?


Compute the CDF of \$X+N\$ using convolution, then differentiate the result.

The CDF of \$X\$ is

\$\$F_X(x) = (1-p)\theta(x) + p\,\theta(x-1)\$\$

where \$\theta\$ is the Heaviside theta function (the indicator function of the nonnegative reals),

\$\$\theta(x) = 1\text{ if }x \ge 0,\quad 0\text{ otherwise}.\$\$

By definition, the CDF of \$X+N\$ is

\$\$F_{X+N}(y) = \Pr(X+N \le y) = \Pr(X \le y-N) = \mathbb{E}\left(F_X(y-N)\right).\$\$

The last equality (which uses the independence of \$X\$ and \$N\$) computes \$F_X(y-N)\$ for each possible value \$N=n\$ and integrates over them all, weighting each by its probability \$f_N(n)\,dn\$. It is a convolution, written as

\$\$\mathbb{E}\left(F_X(y-N)\right) = \int_{\mathbb{R}} F_X(y-n)\, f_N(n)\,dn = (F_X \star f_N)(y).\$\$

Using the expression of \$F_X\$ in terms of Heaviside functions, linearity of integration breaks this integral into two convolutions of multiples of \$\theta\$ against \$f_N\$. But computing such convolutions is trivial, because for any density function \$f\$ with CDF \$F\$,

\$\$(\theta \star f)(y) = \int_{\mathbb{R}} \theta(y-x)f(x)\,dx = \int_{-\infty}^y 1\cdot f(x)\,dx + \int_{y}^{\infty} 0\cdot f(x)\,dx = F(y).\$\$
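This identity is easy to check numerically. The sketch below (names like `theta_conv` are my own) approximates \$(\theta \star f)(y)\$ for a standard normal density by a midpoint Riemann sum on a truncated grid and compares it with the closed-form normal CDF obtained from the error function:

```python
import math

def normal_pdf(x, sigma=1.0):
    # density of a zero-mean normal with standard deviation sigma
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, sigma=1.0):
    # closed form via the error function
    return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

def theta_conv(y, sigma=1.0, lo=-10.0, hi=10.0, n=20000):
    # numerically evaluate (theta * f)(y) = integral of theta(y - x) f(x) dx,
    # i.e. integrate f over x <= y (midpoint Riemann sum on [lo, hi])
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        if y - x >= 0:  # theta(y - x) = 1 exactly when x <= y
            total += normal_pdf(x, sigma) * h
    return total

# the convolution reproduces the normal CDF, up to discretization error
for y in (-1.5, 0.0, 0.7, 2.0):
    assert abs(theta_conv(y) - normal_cdf(y)) < 1e-3
```

The agreement is limited only by the grid spacing; refining `n` tightens it further.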

It should be apparent that the CDF of \$X+N\$ is the corresponding linear combination of the CDF of \$N\$ and the CDF of \$N+1\$ (since \$\Pr(N+1 \le y) = F_N(y-1)\$):

\$\$F_{X+N}(y) = (1-p)F_N(y) + p\,F_N(y-1).\$\$

Differentiating the CDF to obtain the PDF yields the same linear combination of the PDFs:

\$\$f_{X+N}(y) = (1-p)f_N(y) + p\,f_N(y-1).\$\$
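As a sanity check on the final formula, a short Monte Carlo sketch (the helper names and the parameter choices `p = 0.3`, `sigma = 0.5` are mine, for illustration) draws samples of \$Z = X + N\$ and compares the empirical CDF with the closed-form mixture \$(1-p)F_N(y) + p\,F_N(y-1)\$:

```python
import math
import random

def normal_cdf(x, sigma=1.0):
    # CDF of a zero-mean normal via the error function
    return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

def mixture_cdf(y, p, sigma=1.0):
    # closed-form CDF of Z = X + N derived above
    return (1 - p) * normal_cdf(y, sigma) + p * normal_cdf(y - 1, sigma)

random.seed(0)
p, sigma, trials = 0.3, 0.5, 200_000
# X is Bernoulli(p), N is zero-mean normal, drawn independently
samples = [(1 if random.random() < p else 0) + random.gauss(0.0, sigma)
           for _ in range(trials)]

# empirical CDF vs. the closed form at a few points
for y in (-0.5, 0.2, 0.8, 1.5):
    empirical = sum(z <= y for z in samples) / trials
    assert abs(empirical - mixture_cdf(y, p, sigma)) < 5e-3
```

Note that \$Z\$ has no density-only description simpler than this mixture: its PDF is bimodal for small \$\sigma\$, with bumps of mass \$1-p\$ at \$0\$ and \$p\$ at \$1\$.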
