The Gaussian mixture model comes up often as a useful model in machine learning and its applications.

What is the physical significance of this "*Mixture*"?

Is it used because a Gaussian mixture model models the probability of a number of random variables, each with its own mean? If not, what is the correct interpretation of this word?


#### Best Answer

A mixture distribution combines several component distributions with nonnegative weights that typically sum to one (or can be renormalized so they do). A Gaussian mixture is the special case where the components are Gaussians.
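Written out, the density of such a mixture is just a weighted sum of the component densities (standard notation; here $w_k$, $\mu_k$, and $\sigma_k^2$ denote the weight, mean, and variance of component $k$):

$$
f(x) = \sum_{k=1}^{K} w_k \, \mathcal{N}\!\left(x \mid \mu_k, \sigma_k^2\right),
\qquad w_k \ge 0, \quad \sum_{k=1}^{K} w_k = 1.
$$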

For instance, here is a mixture of 25% $N(-2,1)$ and 75% $N(2,1)$, which you could call "one part $N(-2,1)$ and three parts $N(2,1)$":

```r
# Plot the density of the mixture 0.25*N(-2,1) + 0.75*N(2,1)
xx <- seq(-5, 5, by = .01)
plot(xx, 0.25 * dnorm(xx, -2, 1) + 0.75 * dnorm(xx, 2, 1),
     type = "l", xlab = "", ylab = "")
```

Essentially, it's like a recipe. Play around a little with the weights, the means and the variances to see what happens, or look at the two tags on CV.
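The "recipe" view also tells you how to sample from a mixture: first pick a component according to the weights, then draw from that component. Here is a minimal Python sketch (using NumPy, with the same 25%/75% mixture as above; the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Mixture 0.25*N(-2,1) + 0.75*N(2,1)
weights = np.array([0.25, 0.75])
means = np.array([-2.0, 2.0])
sds = np.array([1.0, 1.0])

# Two-stage sampling: pick a component index per draw, then sample
# from the chosen Gaussian.
n = 100_000
components = rng.choice(len(weights), size=n, p=weights)
samples = rng.normal(means[components], sds[components])

# The sample mean should be close to the mixture mean:
# 0.25*(-2) + 0.75*2 = 1.0
print(samples.mean())
```

With enough draws, the histogram of `samples` matches the plotted density, and the sample mean converges to the weighted average of the component means.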
