Suppose I have $\mathbf{X} \sim N(\mathbf{0}, \Sigma)$, and I'm considering two different linear combinations, $a^* X$ and $b^* X$, which we suppose are uncorrelated. I understand that linear combinations of the variables in a multivariate Gaussian are Gaussian, and that *within a multivariate Gaussian*, uncorrelatedness implies independence.

But can I conclude from the uncorrelatedness of $a^* X$ and $b^* X$ that they are independent? Do I know that $a^* X$ and $b^* X$ together form a multivariate Gaussian?

Note that uncorrelatedness of $a^* X$ and $b^* X$ does not imply that $a$ and $b$ are orthogonal: in 2D, for a correlated Gaussian, we could have $a = \langle 1, 0 \rangle$ and $b = \langle -\frac{\sigma_{XY}}{\sigma_{XX}}, 1 \rangle$, which give uncorrelated combinations but are not orthogonal.
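A quick numerical sketch of this example (the covariance entries below are hypothetical values chosen for illustration): the covariance of $a^* X$ and $b^* X$ is $a^\top \Sigma b$, which vanishes for this choice of $b$ even though $a^\top b \neq 0$.

```python
import numpy as np

# Hypothetical 2x2 covariance matrix (values chosen for illustration)
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
sigma_XX, sigma_XY = Sigma[0, 0], Sigma[0, 1]

a = np.array([1.0, 0.0])
b = np.array([-sigma_XY / sigma_XX, 1.0])

# Covariance of a^T X and b^T X is a^T Sigma b
cov_ab = a @ Sigma @ b
print(cov_ab)   # 0.0 -> the two linear combinations are uncorrelated

# ...yet a and b are not orthogonal
print(a @ b)    # -0.4
```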

Thanks.


#### Best Answer

> Do I know that $a^* X$ and $b^* X$ together form a multivariate Gaussian?

Yes. Form a matrix $M$ as:

\begin{align} M &= \left[ \begin{array}{c} a \\ b \end{array} \right] \end{align}

Since $X$ is Gaussian and linear transformations of Gaussians are Gaussian, $MX$ is Gaussian. Since $MX$ is:

\begin{align} MX &= \left[ \begin{array}{c} a^* X \\ b^* X \end{array} \right] \end{align}

we know that $a^* X$ and $b^* X$ are jointly Gaussian. And since they are jointly Gaussian and uncorrelated, they are independent.
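The construction above can be sketched numerically, reusing the hypothetical $\Sigma$, $a$, and $b$ from the question's 2D example: the covariance of $MX$ is $M \Sigma M^\top$, whose off-diagonal entries are zero, confirming the two components are uncorrelated (and hence, being jointly Gaussian, independent).

```python
import numpy as np

# Same illustrative covariance as the question's 2D example
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
a = np.array([1.0, 0.0])
b = np.array([-Sigma[0, 1] / Sigma[0, 0], 1.0])  # <-sigma_XY/sigma_XX, 1>

M = np.vstack([a, b])  # rows of M are a and b, so MX stacks a^T X and b^T X

# Covariance of MX is M Sigma M^T; the off-diagonal entries are zero
cov_MX = M @ Sigma @ M.T
print(cov_MX)

# Empirical check: sample X and compare the sample covariance of MX
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(2), Sigma, size=200_000)
Y = X @ M.T
print(np.cov(Y, rowvar=False))  # off-diagonal entries near 0
```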
