Solved – Conditional expectation of a univariate Gaussian

Suppose I have a univariate Gaussian distribution with mean $\mu_X$ and standard deviation $\sigma_X$, and I know the random variable $X$ is at least some positive value $y$: $X \geq y$. What is the conditional expectation $\mathbb{E}[X \mid X \geq y]$ of $X$ given $X \geq y$? Is there a closed-form expression for this?

Let's assume you mean $X \sim \mathcal{N}(\mu_X, \sigma_X)$ and that $y$ is a constant chosen independently of observing $X$. Reduce the problem to finding the conditional expectation of $Z = X - y$ given $Z \ge 0$: adding $y$ to that value gives the desired answer. (Whether $y$ is positive is immaterial.)
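
To make the reduction concrete, here is a minimal Monte Carlo sketch; the values $\mu_X = 1$, $\sigma_X = 2$, $y = 0.5$ are assumptions chosen purely for illustration, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
mu_x, sigma_x, y = 1.0, 2.0, 0.5      # illustrative values, not from the question

x = rng.normal(mu_x, sigma_x, size=1_000_000)

# Condition directly on X >= y ...
direct = x[x >= y].mean()

# ... or shift to Z = X - y, condition on Z >= 0, and add y back.
z = x - y
shifted = y + z[z >= 0].mean()

print(direct, shifted)                # identical apart from floating-point rounding
```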

The governing property of conditional probability is the multiplicative relationship

$$\Pr(Z \in \mathcal{A} \mid Z \ge 0)\,\Pr(Z \ge 0) = \Pr\left(Z \in \mathcal{A} \cap [0,\infty)\right)$$

for all measurable sets $\mathcal{A}$. In particular, letting $\mathcal{A} = (z, \infty)$ for some $z \ge 0$, solve for the conditional probability:

$$\Pr(Z > z \mid Z \ge 0) = \frac{\Pr(Z > z)}{\Pr(Z \ge 0)}.$$
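
As a quick numerical illustration of this ratio, here is a minimal sketch using `scipy.stats.norm` (the parameters $\mu_X = 1$, $\sigma_X = 2$, $y = 0.5$ are again assumed for illustration only):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu_x, sigma_x, y = 1.0, 2.0, 0.5      # illustrative values, not from the question
m, s = mu_x - y, sigma_x              # Z = X - y is Normal with mean m, sd s

z_draws = rng.normal(m, s, size=1_000_000)
kept = z_draws[z_draws >= 0]          # keep only draws with Z >= 0

for z in (0.0, 0.5, 1.0, 2.0):
    ratio = norm.sf(z, loc=m, scale=s) / norm.sf(0.0, loc=m, scale=s)
    print(z, ratio, (kept > z).mean())  # Pr(Z > z)/Pr(Z >= 0) vs. empirical frequency
```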

The left-hand side is the conditional survival function, while the numerator and denominator on the right are both expressed through the survival function of $Z$ itself. Write $\Phi(z; \mu, \sigma)$ for the Normal distribution function with mean $\mu$ and standard deviation $\sigma$; its complement $1 - \Phi$ is the survival function. Because $Z$ is Normal with mean $\mu_X - y$ and standard deviation $\sigma_X$, the survival function of $Z$ conditioned on $Z \ge 0$ (write this truncated variable $Z^{+}$) is

$$S_{Z^{+}}(z) = \frac{1 - \Phi(z; \mu_X - y, \sigma_X)}{1 - \Phi(0; \mu_X - y, \sigma_X)}$$

for $z \ge 0$. Because $Z^{+}$ is nonnegative, its expectation equals the integral of its survival function, $\mathbb{E}[Z^{+}] = \int_0^\infty S_{Z^{+}}(z)\,dz$. Add back $y$ to give the answer

$$y + \frac{1}{1 - \Phi(0; \mu_X - y, \sigma_X)}\int_0^\infty \left(1 - \Phi(z; \mu_X - y, \sigma_X)\right)dz.$$
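
Evaluating this expression numerically is straightforward; here is a short sketch with the same assumed illustrative parameters, cross-checked against a brute-force simulation of $X$ conditioned on $X \ge y$:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

rng = np.random.default_rng(0)
mu_x, sigma_x, y = 1.0, 2.0, 0.5      # illustrative values, not from the question
m, s = mu_x - y, sigma_x              # Z = X - y is Normal with mean m, sd s

# y + (1 / (1 - Phi(0; m, s))) * integral_0^inf (1 - Phi(z; m, s)) dz
integral, _ = quad(lambda z: norm.sf(z, loc=m, scale=s), 0.0, np.inf)
via_formula = y + integral / norm.sf(0.0, loc=m, scale=s)

# Brute force: average X over draws satisfying X >= y
x = rng.normal(mu_x, sigma_x, size=1_000_000)
via_simulation = x[x >= y].mean()

print(via_formula, via_simulation)    # the two agree to Monte Carlo accuracy
```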

The integral is that of a complementary error function, so it has no expression in elementary functions. Integrating by parts, however, reduces it to the familiar truncated-Normal (inverse Mills ratio) form $\mathbb{E}[X \mid X \ge y] = \mu_X + \sigma_X\,\varphi(\alpha)/(1 - \Phi(\alpha))$ with $\alpha = (y - \mu_X)/\sigma_X$, where $\varphi$ and $\Phi$ denote the standard Normal density and distribution function.
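
For completeness, a minimal check (same assumed parameters) that the $\varphi/\Phi$ form agrees with `scipy.stats.truncnorm`, which implements the mean of a Normal truncated below:

```python
import numpy as np
from scipy.stats import norm, truncnorm

mu_x, sigma_x, y = 1.0, 2.0, 0.5      # illustrative values, not from the question
alpha = (y - mu_x) / sigma_x          # lower truncation point in standard units

# Inverse Mills ratio form: mu + sigma * phi(alpha) / (1 - Phi(alpha))
via_mills = mu_x + sigma_x * norm.pdf(alpha) / norm.sf(alpha)

# scipy's Normal truncated to [y, infinity)
via_truncnorm = truncnorm(a=alpha, b=np.inf, loc=mu_x, scale=sigma_x).mean()

print(via_mills, via_truncnorm)       # both give E[X | X >= y]
```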
