Independence of a linear and a quadratic form

How can I prove the following lemma?

Let $\mathbf{X}^\prime = \left[ X_1, X_2, \ldots, X_n \right]$ where $X_1, X_2, \ldots, X_n$ are observations of a random sample from a distribution which is $N\left(0, \sigma^2\right)$. Then let $\mathbf{b}^\prime = \left[ b_1, b_2, \ldots, b_n \right]$ be a real nonzero vector and let $\mathbf{A}$ be a real symmetric matrix of order $n$. Then $\mathbf{b}^\prime \mathbf{X}$ and $\mathbf{X}^\prime \mathbf{A} \mathbf{X}$ are independent iff $\mathbf{b}^\prime \mathbf{A} = \mathbf{0}$.


I know that $\mathbf{b}^\prime \mathbf{X} \sim N\left(0, \sigma^2 \mathbf{b}^\prime \mathbf{b}\right)$, but I do not see how to proceed from there. My main difficulty lies in the fact that these two variables are of very different kinds; had they been two quadratic forms, Craig's theorem would be of use.

Any advice?

Thank you.

Use Craig's theorem. Consider the quadratic form built from $\mathbf{b}$: since $\left(\mathbf{b}^\prime \mathbf{X}\right)^2 = \mathbf{X}^\prime \left(\mathbf{b}\mathbf{b}^\prime\right) \mathbf{X}$ and $\mathbf{b}\mathbf{b}^\prime$ is real symmetric, Craig's theorem applies to the pair $\mathbf{X}^\prime \left(\mathbf{b}\mathbf{b}^\prime\right) \mathbf{X}$ and $\mathbf{X}^\prime \mathbf{A} \mathbf{X}$. If two random variables are independent, then any univariate functions of those random variables are likewise independent; so if $\mathbf{b}^\prime \mathbf{X}$ and $\mathbf{X}^\prime \mathbf{A} \mathbf{X}$ are independent, the two quadratic forms are independent, and Craig's theorem yields $\left(\mathbf{b}\mathbf{b}^\prime\right)\mathbf{A} = \mathbf{0}$, hence $\mathbf{b}^\prime \mathbf{A} = \mathbf{0}$ because $\mathbf{b} \neq \mathbf{0}$. Conversely, if $\mathbf{b}^\prime \mathbf{A} = \mathbf{0}$, then $\mathbf{A}\mathbf{b} = \mathbf{0}$ by symmetry, so $\operatorname{Cov}\left(\mathbf{b}^\prime \mathbf{X}, \mathbf{A}\mathbf{X}\right) = \sigma^2 \mathbf{A}\mathbf{b} = \mathbf{0}$; as $\mathbf{b}^\prime \mathbf{X}$ and $\mathbf{A}\mathbf{X}$ are jointly normal and uncorrelated, they are independent, and $\mathbf{X}^\prime \mathbf{A} \mathbf{X}$ is a function of $\mathbf{A}\mathbf{X}$, so $\mathbf{b}^\prime \mathbf{X}$ and $\mathbf{X}^\prime \mathbf{A} \mathbf{X}$ are likewise independent.
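In symbols, the version of Craig's theorem assumed here is: for $\mathbf{X} \sim N\left(\mathbf{0}, \sigma^2 \mathbf{I}_n\right)$ and real symmetric $\mathbf{P}$, $\mathbf{Q}$, the forms $\mathbf{X}^\prime \mathbf{P} \mathbf{X}$ and $\mathbf{X}^\prime \mathbf{Q} \mathbf{X}$ are independent iff $\mathbf{P}\mathbf{Q} = \mathbf{0}$. With $\mathbf{P} = \mathbf{b}\mathbf{b}^\prime$ this gives

$$\mathbf{X}^\prime \left(\mathbf{b}\mathbf{b}^\prime\right) \mathbf{X} \text{ and } \mathbf{X}^\prime \mathbf{A} \mathbf{X} \text{ independent} \iff \left(\mathbf{b}\mathbf{b}^\prime\right)\mathbf{A} = \mathbf{b}\left(\mathbf{b}^\prime \mathbf{A}\right) = \mathbf{0} \iff \mathbf{b}^\prime \mathbf{A} = \mathbf{0},$$

the last step because some component $b_i$ is nonzero, so the $i$-th row of $\mathbf{b}\left(\mathbf{b}^\prime \mathbf{A}\right)$ is $b_i \left(\mathbf{b}^\prime \mathbf{A}\right)$. For the converse direction, the identity $\mathbf{A} = \mathbf{A}\mathbf{A}^{+}\mathbf{A}$ (with $\mathbf{A}^{+}$ the Moore–Penrose inverse) gives

$$\mathbf{X}^\prime \mathbf{A} \mathbf{X} = \left(\mathbf{A}\mathbf{X}\right)^\prime \mathbf{A}^{+} \left(\mathbf{A}\mathbf{X}\right),$$

a function of $\mathbf{A}\mathbf{X}$ alone.

If you want a numerical sanity check before writing this up, here is a minimal Monte Carlo sketch; the particular $n$, $\sigma$, $\mathbf{b}$, and $\mathbf{A}$ are arbitrary illustrative choices, not part of the lemma. Under independence, $\operatorname{Cov}\left(\left(\mathbf{b}^\prime \mathbf{X}\right)^2, \mathbf{X}^\prime \mathbf{A} \mathbf{X}\right)$ must vanish; that covariance equals $2\sigma^4 \, \mathbf{b}^\prime \mathbf{A} \mathbf{b}$, which is generally nonzero when $\mathbf{b}^\prime \mathbf{A} \neq \mathbf{0}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, reps = 4, 2.0, 200_000   # arbitrary illustrative values

b = np.array([1.0, -1.0, 0.0, 0.0])
# Build a symmetric A with b'A = 0 by sandwiching a diagonal matrix
# between projections onto the orthogonal complement of b.
P = np.eye(n) - np.outer(b, b) / (b @ b)
A = P @ np.diag([1.0, 2.0, 3.0, 4.0]) @ P
assert np.allclose(b @ A, 0.0)

X = rng.normal(0.0, sigma, size=(reps, n))   # each row is a draw of X
lin = X @ b                                  # b'X for each draw
quad = np.einsum('ij,jk,ik->i', X, A, X)     # X'AX for each draw

# Near 0 here; rebuild A without the projections (so that b'A != 0)
# and this covariance generally moves away from 0.
print(np.cov(lin**2, quad)[0, 1])
```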
