Solved – Questions about antithetic variate method

Suppose we want to estimate an expectation $E(f(X))$, where $X$ is a random variable with known distribution, by simulation with the Law of Large Numbers (LLN) estimator. The antithetic variate method is a way to reduce the variance of the estimator in such cases.

If $X$ is a one-dimensional random variable with cdf $F$, the antithetic method is applied as follows:

Draw a sample $U$ from the uniform distribution over $[0,1]$. Then $X_1=F^{-1}(U)$ has $F$ as its cdf, $Y_1=F^{-1}(1-F(X_1))=F^{-1}(1-U)$ also has $F$ as its cdf, and $X_1$ and $Y_1$ are negatively correlated. Then $E(f(X))$ is estimated by $\frac{\sum_{i=1}^N [f(X_i)+f(Y_i)]}{2N}$.
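The inverse-cdf construction above can be sketched in a few lines of plain Python. This is a minimal illustration, not part of the original question; the Exponential(1) example and the helper name `antithetic_estimate` are my own choices.

```python
import math
import random

def antithetic_estimate(f, F_inv, n, seed=0):
    """Estimate E[f(X)] from n antithetic pairs X_i = F^{-1}(U_i), Y_i = F^{-1}(1 - U_i)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.random()  # U ~ Uniform[0, 1)
        total += f(F_inv(u)) + f(F_inv(1.0 - u))
    return total / (2 * n)

# Example: X ~ Exponential(1), where F^{-1}(u) = -log(1 - u) and E[X] = 1.
est = antithetic_estimate(lambda x: x, lambda u: -math.log(1.0 - u), 100_000)
```

The same function works for any distribution whose quantile function $F^{-1}$ is computable.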

Here are my questions:

  1. If $X_1$ and $Y_1$ are negatively correlated, then for the variance of the estimator to be reduced, is it correct that $f(X_1)$ and $f(Y_1)$ must also be negatively correlated? What is the condition on $f$ for this to be true?

  2. If $X$ is a multivariate random variable, its cdf $F$ has no quantile inverse $F^{-1}$. Is it still possible to apply the antithetic method in general?
    In the special case where the components of $X$ are independent, is it possible to apply the antithetic method? If so, how?

  3. I notice that the antithetic method also doubles the number of samples without actually drawing more from any distribution, and increasing the sample size by itself reduces the variance of LLN estimators. When the antithetic method reduces the variance of the estimator, how much of the reduction comes from the larger sample size and how much from the negative correlation between samples? If we instead used iid samples of the same size as the doubled antithetic sample, which estimator would have smaller variance?

Thanks and regards!

  1. Yes. There's no simple condition. When $f$ is monotonic, $f(X_1)$ and $f(Y_1)$ will still be negatively correlated. When $f$ is not monotonic, all bets are off. For example, let $F$ be the cdf of the uniform distribution on $[-1,1]$ and let $f(x) = x^2$. Then $X_1 = -Y_1$, whence $f(X_1) = f(Y_1)$, implying $f(X_1)$ and $f(Y_1)$ are perfectly positively correlated: you gain no more information about the expectation from $(X_1, Y_1)$ than you do from $X_1$ alone. The cost of using the antithetic method in this extreme case is a doubling of the sample size needed to achieve a given estimation variance.

    A practical example of the problem with non-monotonic $f$ appears here.

  2. Yes, in some cases. Apply the antithetic method to the components of $X$ separately, replacing each component's uniform draw $U_j$ by $1-U_j$. This ought to work provided the components are not strongly correlated or when $F$ is symmetric.

  3. Provided $f(X_1)$ and $f(Y_1)$ are negatively correlated, you get smaller estimation variance with the antithetic method than with iid samples of the same total size. As an extreme example, consider the case where $F$ is uniform on $[-1,1]$ and $f$ is the identity. Then for any single sample, $Y_1 = -X_1$, and the pair mean $(X_1+Y_1)/2 = 0$ estimates the mean of $F$ exactly; whereas the mean of two independent samples $(X_1, X_2)$ has variance $(1/3+1/3)/4 = 1/6$, since each Uniform$[-1,1]$ sample has variance $1/3$.
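The monotone-vs-non-monotone distinction in answer 1 is easy to check numerically. The following sketch (my own helper, not from the answer) estimates the sample covariance of $f(X)$ and $f(Y)$ over antithetic pairs for Uniform$[-1,1]$: the identity gives a strongly negative covariance, while $f(x)=x^2$ gives a positive one.

```python
import random

def pair_covariance(f, F_inv, n=200_000, seed=1):
    """Sample n antithetic pairs and return the sample covariance of f(X) and f(Y)."""
    rng = random.Random(seed)
    fx, fy = [], []
    for _ in range(n):
        u = rng.random()
        fx.append(f(F_inv(u)))
        fy.append(f(F_inv(1.0 - u)))
    mx = sum(fx) / n
    my = sum(fy) / n
    return sum((a - mx) * (b - my) for a, b in zip(fx, fy)) / (n - 1)

unif_inv = lambda u: 2.0 * u - 1.0  # quantile function of Uniform[-1, 1]
cov_monotone = pair_covariance(lambda x: x, unif_inv)      # about -1/3
cov_square   = pair_covariance(lambda x: x * x, unif_inv)  # positive, since X = -Y makes f(X) = f(Y)
```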
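The componentwise construction in answer 2 can be sketched as follows, assuming independent components so each coordinate has its own quantile function. The function name and the list-of-quantile-functions interface are my own illustration.

```python
import random

def antithetic_multivariate(f, F_invs, n, seed=2):
    """For X with independent components, flip U_j -> 1 - U_j in every coordinate.
    F_invs is a list of per-component quantile functions (an assumed interface)."""
    rng = random.Random(seed)
    d = len(F_invs)
    total = 0.0
    for _ in range(n):
        us = [rng.random() for _ in range(d)]
        x = [F_invs[j](us[j]) for j in range(d)]        # X_i
        y = [F_invs[j](1.0 - us[j]) for j in range(d)]  # antithetic partner Y_i
        total += f(x) + f(y)
    return total / (2 * n)

# Example: X = (X1, X2) with independent Uniform[0,1] components and f = sum, so E[f(X)] = 1.
est = antithetic_multivariate(sum, [lambda u: u, lambda u: u], 50_000)
```

For this particular $f$, each antithetic pair sums to exactly 2, so the estimate is essentially exact; less symmetric integrands still benefit whenever $f$ is monotone in each coordinate.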
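The extreme case in answer 3 can also be checked directly: compare the per-pair estimator variance for an antithetic pair against two iid samples of Uniform$[-1,1]$ with $f$ the identity (a throwaway simulation of my own, not from the answer).

```python
import random

rng = random.Random(3)
n = 100_000

antithetic_pair_means = []
iid_pair_means = []
for _ in range(n):
    x1 = rng.uniform(-1.0, 1.0)
    x2 = rng.uniform(-1.0, 1.0)
    antithetic_pair_means.append((x1 + (-x1)) / 2.0)  # antithetic partner of x1 is -x1
    iid_pair_means.append((x1 + x2) / 2.0)            # two independent samples

# The true mean is 0, so the mean of squares estimates each estimator's variance.
var_antithetic = sum(m * m for m in antithetic_pair_means) / n  # exactly 0
var_iid = sum(m * m for m in iid_pair_means) / n                # about 1/6
```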

This technique seems to be related, at least in spirit, to Latin hypercube sampling.
