Independence between random variables $X$ and $Y$ implies that $\text{Corr}\left(f(X),g(Y)\right)=0$ for arbitrary functions $f(\cdot)$ and $g(\cdot)$ (here is a related thread).

But is the following statement, or a similar one (perhaps more rigorously defined), correct?

If $\text{Corr}\left(f(X),g(Y)\right)=0$ for all possible functions $f(\cdot)$ and $g(\cdot)$, then $X$ and $Y$ are independent.


#### Best Answer

Using indicator functions of measurable sets, such as $$f(x)=\mathbb I_A(x),\qquad g(x)=\mathbb I_B(x),$$ leads to $$\text{cov}(f(X),g(Y))=\mathbb P(X\in A,Y\in B)-\mathbb P(X\in A)\,\mathbb P(Y\in B),$$ so if this covariance vanishes for all measurable sets $A$ and $B$, then $X$ and $Y$ are independent. As shown in the following snapshot of A. Dembo's probability course, proving the result for indicator functions is enough.
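A small numerical illustration of why indicators are the right test functions (a sketch; the sets, distributions, and sample size here are my own choices, not from the answer): plain correlation can be zero for dependent variables, while the correlation of well-chosen indicator functions exposes the dependence.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def ind_corr(x, y, A=lambda t: t > 1.0, B=lambda t: t > 1.0):
    """Sample correlation of the indicators 1_A(x) and 1_B(y)."""
    fx, gy = A(x).astype(float), B(y).astype(float)
    return np.corrcoef(fx, gy)[0, 1]

# Independent pair: every indicator correlation should be near 0.
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Classic uncorrelated-but-dependent pair: Y = S*X with S = +/-1
# independent of X, so Corr(X, Y) = 0 yet |Y| = |X| exactly.
s = rng.choice([-1.0, 1.0], size=n)
y_dep = s * x

print(ind_corr(x, y))               # near 0: independence
print(np.corrcoef(x, y_dep)[0, 1])  # near 0: plain correlation misses the dependence
print(ind_corr(x, y_dep,
               A=lambda t: np.abs(t) > 1.0,
               B=lambda t: np.abs(t) > 1.0))  # far from 0: indicators detect it
```

The last correlation equals 1 in this construction, since $\{|X|>1\}$ and $\{|Y|>1\}$ are the same event; any pair of sets $A,B$ with $\mathbb P(X\in A,Y\in B)\neq\mathbb P(X\in A)\mathbb P(Y\in B)$ would likewise give a nonzero value.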

The reduction from indicator functions to arbitrary measurable functions is justified by a monotone class theorem, stated in the same course notes:
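For completeness, the measure-theoretic step can be written out; this is a standard paraphrase of the Dynkin ($\pi$–$\lambda$) argument, not Dembo's exact wording:

```latex
% Paraphrase of the standard Dynkin pi-lambda step; not Dembo's exact statement.
Fix a measurable set $A$ and let
\[
  \mathcal L_A \;=\; \bigl\{\, B :\ \mathbb P(X\in A,\ Y\in B)
      \;=\; \mathbb P(X\in A)\,\mathbb P(Y\in B) \,\bigr\}.
\]
Then $\mathcal L_A$ is a $\lambda$-system, and by the covariance identity for
indicators it contains the $\pi$-system of rays $(-\infty,b]$; Dynkin's
theorem then gives $\mathcal L_A = \mathcal B(\mathbb R)$. Repeating the same
argument in $A$ shows that $X$ and $Y$ are independent.
```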

### Similar Posts:

- Solved – Prove that $\text{Corr}(X^2,Y^2)=\rho^2$ where $X,Y$ are jointly $N(0,1)$ variables with correlation $\rho$
- Solved – invariance of correlation to linear transformation: $\text{corr}(aX+b, cY+d) = \text{corr}(X,Y)$