Solved – Multiplying a multivariate Gaussian by a univariate Gaussian

I am not sure if this question makes sense, but I have been spending a lot of time reading about expectation propagation, and one of its key operations, as explained by Tom Minka, is the projection of a product of distributions onto the best Gaussian approximation.

So, I have a setup where my joint distribution takes the following form:
$$
P(w) \times \prod_{i} P(y_i | w)
$$

Now the likelihood can be written as:
$$
P(y_i | w) = \sqrt{\frac{\phi}{2\pi}} \, \exp\!\left( -\tfrac{1}{2}\, e_i \,\phi\, e_i \right)
$$

where, for the sake of simplicity, let us assume that $\phi$ is given and $e_i$ can be calculated from the observation $y_i$.

$P(w)$, however, is a multivariate Gaussian with a full covariance structure.

So now, when I approximate each of these factors in turn, I will have a product over such univariate distributions and then a product with my current approximation to $P(w)$, which is a multivariate Gaussian. I am utterly confused as to how this can be done. I know simple expressions for the mean and variance exist when one multiplies univariate Gaussians, but I am not even sure what to do in this case. Do I need to look at the $i$th component of the multivariate Gaussian as well?

I would appreciate any help/suggestion with this.

Take a look at http://arxiv.org/pdf/1111.6832.pdf, Appendix A in particular, where they work out the cavity distributions; it is the same approach.

Essentially, you only add the natural parameters to the dimension to which they correspond. For example, let $K \in \mathbb{R}^{n \times n}$ and $\mu \in \mathbb{R}^n$ be the covariance and mean of $P(w)$, and let $\sigma^2, u$ be the variance and mean of $P(y_i|w)$, viewed as a Gaussian in $w_i$. Further, let $e_i$ be the unit vector in the direction of the $i^{\text{th}}$ dimension. As such,

$P(w)\,P(y_i|w) \propto \mathcal{N}(\mu', K')$

where,

$K' = \left(K^{-1} + \frac{1}{\sigma^2} e_i e_i^T\right)^{-1}$

$\mu' = K'\left(K^{-1}\mu + \frac{u}{\sigma^2} e_i\right)$
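
Written in natural-parameter form, these updates make the statement above explicit: the univariate factor only touches the $i^{\text{th}}$ coordinate,

$$
K'^{-1} = K^{-1} + \frac{1}{\sigma^2} e_i e_i^T, \qquad K'^{-1}\mu' = K^{-1}\mu + \frac{u}{\sigma^2} e_i,
$$

i.e. $1/\sigma^2$ is added to the $(i,i)$ entry of the precision matrix and $u/\sigma^2$ to the $i^{\text{th}}$ entry of the precision-weighted mean, while every other natural parameter is left unchanged.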

These updates can be performed for all factors in one go, to cut down on the cost of inverting $K$ each time.
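
As a concrete illustration, here is a minimal NumPy sketch of this update, assuming the factor $P(y_i|w)$ reduces to a univariate Gaussian $N(w_i; u, \sigma^2)$ in the single coordinate $w_i$. The function names and the Sherman–Morrison variant are my own illustration, not from the paper:

```python
import numpy as np

def multiply_univariate_factor(K, mu, i, u, sigma2):
    """Multiply N(mu, K) by a univariate Gaussian factor N(w_i; u, sigma2).

    Direct natural-parameter update: add 1/sigma2 to the (i, i) entry of the
    precision and u/sigma2 to the i-th entry of the precision-weighted mean,
    then convert back to moment form.
    """
    n = K.shape[0]
    e = np.zeros(n)
    e[i] = 1.0
    precision = np.linalg.inv(K) + np.outer(e, e) / sigma2   # K'^{-1}
    eta = np.linalg.solve(K, mu) + (u / sigma2) * e          # K'^{-1} mu'
    K_new = np.linalg.inv(precision)
    mu_new = K_new @ eta
    return mu_new, K_new

def multiply_univariate_factor_sm(K, mu, i, u, sigma2):
    """Same update via the Sherman-Morrison identity, avoiding any inverse of K.

    K'  = K  - K e_i e_i^T K / (sigma2 + K_ii)
    mu' = mu + K[:, i] (u - mu_i) / (sigma2 + K_ii)
    """
    k_i = K[:, i]
    denom = sigma2 + K[i, i]
    K_new = K - np.outer(k_i, k_i) / denom
    mu_new = mu + k_i * (u - mu[i]) / denom
    return mu_new, K_new

# Example: a 3-dimensional prior updated by one univariate factor on w_1.
K = np.array([[2.0, 0.5, 0.2],
              [0.5, 1.5, 0.3],
              [0.2, 0.3, 1.0]])
mu = np.zeros(3)
m1, C1 = multiply_univariate_factor(K, mu, i=1, u=0.7, sigma2=0.4)
m2, C2 = multiply_univariate_factor_sm(K, mu, i=1, u=0.7, sigma2=0.4)
assert np.allclose(m1, m2) and np.allclose(C1, C2)
```

The second variant avoids recomputing $K^{-1}$ for each factor; alternatively, the rank-one terms for all factors can be accumulated in natural-parameter form and the inversion performed once, as noted above.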
