# Solved – Central Limit Theorem for square roots of sums of i.i.d. random variables

Intrigued by a question at math.stackexchange, and investigating it empirically, I am wondering about the following statement on the square root of sums of i.i.d. random variables.

Suppose $X_1, X_2, \ldots, X_n$ are i.i.d. random variables with finite non-zero mean $\mu$ and variance $\sigma^2$, and $\displaystyle Y=\sum_{i=1}^n X_i$. The central limit theorem says $\displaystyle \dfrac{Y - n\mu}{\sqrt{n\sigma^2}} \xrightarrow{d} N(0,1)$ as $n$ increases.

If $Z=\sqrt{|Y|}$, can I also say something like $\displaystyle \dfrac{Z - \sqrt{n|\mu| - \tfrac{\sigma^2}{4|\mu|}}}{\sqrt{\tfrac{\sigma^2}{4|\mu|}}} \xrightarrow{d} N(0,1)$ as $n$ increases?
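For intuition, here is a heuristic delta-method sketch of where that centering and scaling could come from (my own reasoning, not a proof; it assumes $\mu > 0$, so that $P(Y>0)\to 1$ and $|Y|$ can be replaced by $Y$):

$$
\sqrt{Y} = \sqrt{n\mu + (Y - n\mu)} \approx \sqrt{n\mu} + \frac{Y - n\mu}{2\sqrt{n\mu}} - \frac{(Y - n\mu)^2}{8(n\mu)^{3/2}},
$$

so to leading order $\operatorname{Var}(\sqrt{Y}) \approx \dfrac{n\sigma^2}{4n\mu} = \dfrac{\sigma^2}{4\mu}$, and taking expectations of the second-order term gives $E[\sqrt{Y}] \approx \sqrt{n\mu} - \dfrac{\sigma^2}{8\mu^{3/2}\sqrt{n}}$, which agrees with $\sqrt{n\mu - \tfrac{\sigma^2}{4\mu}}$ to the same order, since $\sqrt{n\mu - \tfrac{\sigma^2}{4\mu}} \approx \sqrt{n\mu} - \dfrac{\sigma^2}{8\mu^{3/2}\sqrt{n}}$.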

For example, suppose the $X_i$ are Bernoulli with mean $p$ and variance $p(1-p)$; then $Y$ is binomial and I can simulate this in R, say with $p=\frac13$:

```r
set.seed(1)
cases <- 100000
n <- 1000
p <- 1/3
Y <- rbinom(cases, size=n, prob=p)
Z <- sqrt(abs(Y))
```

which gives approximately the hoped-for mean and variance for $Z$

```r
> c(mean(Z), sqrt(n*p - (1-p)/4))
[1] 18.25229 18.25285
> c(var(Z), (1-p)/4)
[1] 0.1680012 0.1666667
```

and a Q-Q plot which looks close to Gaussian

```r
qqnorm(Z)
```
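As a further numerical check beyond the Q-Q plot (my own extension of the simulation above, using the conjectured centering and scaling), the standardised variable can be compared against standard normal quantiles directly:

```r
set.seed(1)
cases <- 100000
n <- 1000
p <- 1/3
Y <- rbinom(cases, size=n, prob=p)
Z <- sqrt(abs(Y))

# standardise Z with the conjectured mean sqrt(n*p - (1-p)/4)
# and standard deviation sqrt((1-p)/4)
W <- (Z - sqrt(n*p - (1-p)/4)) / sqrt((1-p)/4)

# compare empirical quantiles of W with standard normal quantiles;
# close agreement supports approximate normality of Z
probs <- c(0.025, 0.25, 0.5, 0.75, 0.975)
round(cbind(empirical = quantile(W, probs), normal = qnorm(probs)), 3)
```

If the conjecture holds, the two columns should agree closely, and the empirical mean and variance of `W` should be near 0 and 1.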