I have some trouble understanding the variance reduction method called
"Antithetic variables":
Suppose that the integrand is $g(x)=x^2$ and the reference density $f(x)=e^{-x}\mathbb{I}_{[0,\infty)}(x)$ is the Exp(1) density. The associated cdf is $$F_X(x)=(1-e^{-x})\,\mathbb{I}_{[0,\infty)}(x)$$
The antithetic variable method then uses the inverse cdf representation
\begin{align*}
U&=1-e^{-X}\\
X&=-\log(1-U)
\end{align*}
where $U\sim \mathrm{U}[0,1]$. To find an antithetic variable, I used
\begin{align*}
U^*&=1-U\\
1-U&=1-e^{-X^*}\\
X^*&=-\log(U)
\end{align*}
Therefore, given $U$,
$$Y=-\log(1-U),\qquad Y^*=-\log(U)$$ and $Y$ and $Y^*$ are both Exp(1) variables. Both can be used in a Monte Carlo integration, e.g.
$$\int_0^\infty g(x)f(x)\,\mathrm{d}x \approx \frac{1}{2n}\sum_{i=1}^n (Y_i+Y^*_i)$$
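As a sanity check, I verified numerically that both transforms give Exp(1) draws that are negatively correlated (a minimal sketch in R; the sample size is arbitrary):

u <- runif(10^5)
y <- -log(1 - u)         # Y = -log(1-U)
ystar <- -log(u)         # Y* = -log(U)
c(mean(y), mean(ystar))  # both should be near 1, the Exp(1) mean
cor(y, ystar)            # negative correlation between the pair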
Please correct me where I'm wrong
Best Answer
All the theory you need
$Z\sim F$, and you want to estimate $\mathrm{E}[Z]$.
Let $\sigma^2=\mathrm{Var}[Z]<\infty$.
Simple Monte Carlo
Construct $X_1,X_2,\dots$ IID with $X_1\sim F$.
Define $\bar{X}_n=\frac{1}{n}\sum_{i=1}^n X_i$.
Result: $\mathrm{Var}[\bar{X}_n]=\sigma^2/n$.
Strong Law: $\bar{X}_n\to\mathrm{E}[Z]$ a.s.
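For concreteness, a minimal illustration of the $\sigma^2/n$ result in R (using Exp(1) draws, where $\sigma^2=1$, purely as an illustrative assumption):

n <- 10^4
xbar <- replicate(10^3, mean(rexp(n)))  # 1000 independent copies of the sample mean
var(xbar)                               # should be close to sigma^2/n = 1e-4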
Antithetic Variables
Construct $X'_1,X'_2,\dots$ such that, for $i\geq 1$,
- $X'_i\sim F$;
- $\mathrm{Cov}[X'_{2i-1},X'_{2i}]<0$;
- The pairs $(X'_1,X'_2),(X'_3,X'_4),\dots$ are IID.
Define $Y_i=(X'_{2i-1}+X'_{2i})/2$ and $\bar{Y}_n=\frac{1}{n}\sum_{i=1}^n Y_i$.
Result: $$\mathrm{Var}[\bar{Y}_n]=\frac{\sigma^2+\mathrm{Cov}[X'_1,X'_2]}{2n} < \frac{\sigma^2}{2n}<\mathrm{Var}[\bar{X}_n].$$
Strong Law: $\bar{Y}_n\to\mathrm{E}[Z]$ a.s.
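Here is a toy demonstration of the variance identity in R (the target $\mathrm{E}[U^2]=1/3$ and the pairing $(U,1-U)$ are illustrative assumptions, not the application below):

u <- runif(10^5)
z <- u^2                     # plain draws of Z = U^2
y <- (u^2 + (1 - u)^2) / 2   # antithetic pair averages Y_i
var(z) / 2                   # sigma^2/2: per-pair variance without antithetics
var(y)                       # (sigma^2 + Cov)/2: strictly smaller since Cov < 0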
Your application
Define $U_1,U_2,\dots$ IID $\mathrm{U}[0,1]$.
For $i\geq 1$, define $X'_{2i-1}=(\log(1-U_i))^2$ and $X'_{2i}=(\log(U_i))^2$.
Prove 1, 2, 3 above.
Remember that $U_i\sim 1-U_i$, and $\mathrm{Cov}[U_i,1-U_i]<0$.
Also, $x\mapsto (\log x)^2$ is monotonically decreasing for $x\in(0,1]$.
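A quick numerical check of these three properties (a sketch in R, not a substitute for the proofs):

u <- runif(10^5)
x1 <- (log(1 - u))^2   # X'_{2i-1}
x2 <- (log(u))^2       # X'_{2i}
c(mean(x1), mean(x2))  # both near E[X^2] = 2 for X ~ Exp(1): same distribution F
cov(x1, x2)            # negative, as the monotonicity argument predicts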
Simulation
n <- 10^6
u <- runif(n)
# simple
x <- (log(1-u))^2
mean(x)
sqrt(var(x)/n)
# antithetic
y <- ((log(1-u))^2 + (log(u))^2)/2
mean(y)
sqrt(var(y)/n)
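Both sample means should land close to the true value $\mathrm{E}[X^2]=2$ for $X\sim\mathrm{Exp}(1)$, and the antithetic standard error should come out smaller than the simple one, since $\mathrm{Cov}[X'_1,X'_2]<0$; note that the antithetic estimator also evaluates the integrand twice per uniform draw.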