I'm talking with my advisor about how to compute standard deviations for, say, combined standardized test scores for admissions purposes. For example, we'd be interested to compute the sum of the verbal and quantitative scores from the GRE, which are correlated, and normed to be approximately normal.

More formally, say you have a multivariate normal vector $X \sim N(\mu, \Sigma)$ with mean vector $\mu$ and covariance matrix $\Sigma$, where $X = (X_1, X_2, \dots)$ and the covariances are non-zero. What is $\operatorname{Var}\!\left(\sum_i X_i\right)$? If it's hard to compute in general, I'm happy with the bivariate case for now, or a recursive approach or similar.


#### Best Answer

The variance is the matrix product $$\mathbf{1}'\Sigma\,\mathbf{1},$$ where $\mathbf{1}$ is the vector of all ones. Equivalently, it is the sum of every entry of $\Sigma$: all the variances on the diagonal plus all the pairwise covariances.
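A minimal numeric sketch with NumPy, using hypothetical numbers for the two-score case: variances of 100 for each score and a covariance of 50 between them (these values are illustrative assumptions, not GRE norms):

```python
import numpy as np

# Hypothetical covariance matrix for two correlated test scores
# (e.g. verbal and quantitative): variance 100 each, covariance 50.
Sigma = np.array([[100.0, 50.0],
                  [50.0, 100.0]])

# Vector of ones, matching the dimension of Sigma.
ones = np.ones(Sigma.shape[0])

# Var(X_1 + X_2) = 1' Sigma 1, i.e. the sum of all entries of Sigma.
var_sum = ones @ Sigma @ ones
sd_sum = np.sqrt(var_sum)

print(var_sum)  # 300.0 = 100 + 100 + 50 + 50
print(sd_sum)   # standard deviation of the combined score
```

Note that `ones @ Sigma @ ones` and `Sigma.sum()` give the same number; the matrix form generalizes immediately to any number of components.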
