Solved – ‘mean’ of logit-normal distribution

$X \sim N(\mu, \sigma^2)$

$Y = \dfrac{\exp(X)}{1+\exp(X)}$

$Y$ has a logit-normal distribution.

When I have a correct estimate of the mean of the logit-normal distribution, say $\bar{Y}$,
how can I use it to get a correct estimate of $\mu$?

I don't have a full sample from the logit-normal distribution, only a correct estimate of the mean.
I know that $\log\left(\dfrac{\bar{Y}}{1-\bar{Y}}\right)$ is not the right answer, due to Jensen's inequality.
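A quick simulation (my own addition, not part of the original question) illustrates that bias; the parameter values and the seed are arbitrary choices:

```python
# Demonstrate that the naive back-transform logit(mean(Y)) does not recover mu.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0                      # arbitrary example parameters
x = rng.normal(mu, sigma, size=1_000_000)
y = 1.0 / (1.0 + np.exp(-x))              # Y = exp(X) / (1 + exp(X))

y_bar = y.mean()
naive_mu = np.log(y_bar / (1.0 - y_bar))  # logit of the sample mean of Y
print(f"true mu = {mu:.2f}, logit of mean(Y) = {naive_mu:.2f}")
# The second number differs noticeably from the true mu,
# illustrating the bias the question describes.
```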

The integral appearing here is the so-called logistic-normal integral. Denoting it as
\begin{equation}
\varphi(\mu,\sigma) = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{\infty} \frac{1}{1+e^{-x}}\, e^{-\frac{1}{2\sigma^2}(x-\mu)^2}\, dx,
\end{equation}

one needs to solve the equation $\varphi(\mu,\sigma) = \bar{Y}$ for $\mu$. One must assume that $\sigma$ is known in order to be able to solve this equation.
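In practice (this is my own addition, not part of the original answer), $\varphi(\mu,\sigma)$ can also be evaluated by numerical quadrature and, since it is strictly increasing in $\mu$, the equation can be inverted with a bracketed root finder. A minimal sketch, with `phi` and `solve_mu` being names of my own choosing:

```python
# Evaluate the logistic-normal integral numerically and invert phi(mu, sigma) = Y_bar.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq
from scipy.special import expit          # numerically stable 1/(1 + exp(-x))

def phi(mu, sigma):
    """Logistic-normal integral: E[1/(1 + exp(-X))] with X ~ N(mu, sigma^2)."""
    integrand = lambda x: expit(x) * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    val, _ = quad(integrand, mu - 10.0 * sigma, mu + 10.0 * sigma)
    return val / (np.sqrt(2.0 * np.pi) * sigma)

def solve_mu(y_bar, sigma, bracket=50.0):
    """Solve phi(mu, sigma) = y_bar for mu, assuming sigma is known."""
    return brentq(lambda m: phi(m, sigma) - y_bar, -bracket, bracket)

# Round-trip check: recover mu from the exact logit-normal mean.
sigma, mu_true = 2.0, 1.0
y_bar = phi(mu_true, sigma)
print(solve_mu(y_bar, sigma))            # approximately 1.0
```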

One can show that $\varphi(\mu,\sigma)$ can be computed exactly at $\mu = 0, \pm\sigma^2, \pm 2\sigma^2, \cdots$. This was shown in Pirjol (2013).

(The integral is a limiting case of the Mordell integral from analytic number theory, and has many surprising symmetry properties.) Then it is just a matter of bracketing $\bar{Y}$ between two points on the grid of spacing $\sigma^2$, and solving for $\mu$.
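The exact grid values themselves are not reproduced in this post, so the following sketch (again my own, reusing `phi`, `brentq`, `y_bar`, and `sigma` from the snippet above) evaluates the grid points numerically, purely to illustrate the bracketing step on the grid $\mu = k\sigma^2$:

```python
def solve_mu_on_grid(y_bar, sigma, k_max=1000):
    """Bracket y_bar between phi at consecutive grid points mu = k*sigma^2,
    then refine mu inside that interval with Brent's method."""
    h = sigma ** 2
    direction = 1.0 if y_bar >= 0.5 else -1.0   # phi(0, sigma) = 1/2 by symmetry
    for k in range(k_max):
        a, b = sorted((k * h * direction, (k + 1) * h * direction))
        if phi(a, sigma) <= y_bar <= phi(b, sigma):
            return brentq(lambda m: phi(m, sigma) - y_bar, a, b)
    raise ValueError("y_bar was not bracketed within k_max grid intervals")

print(solve_mu_on_grid(y_bar, sigma))           # same answer as solve_mu above
```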

References

Pirjol, D. (2013). "The logistic-normal integral and its generalizations." Journal of Computational and Applied Mathematics, 237(1), 460-469.
