I am taking an ML course at my university, and the professor mentioned the term expectation ($E$) while explaining Gaussian processes to us. From the way he explained it, I understood that $E$ is the same as the mean $\mu$. Did I understand that right?
If it is the same, then do you know why both symbols are used?
Also I saw that E can be used as a function, like E($x^2$), but I didn't see that for μ.
Can someone help me understand the difference between the two better?
Expectation (or expected value) is an operator that can be applied to a random variable. For a discrete random variable (like a binomial) with $k$ possible values it is defined as $\sum_{i=1}^{k} x_i \, p(x_i)$. That is, it's the mean of the possible values weighted by the probability of those values. Continuous random variables can be thought of as the generalization of this: $\int x \, dP$. The mean of a random variable is a synonym for its expectation.
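The discrete definition can be sketched in a few lines of Python (the function name and the fair-die example are just for illustration):

```python
# Expectation of a discrete random variable:
# E[X] = sum_i x_i * p(x_i), the probability-weighted mean of the values.

def expectation(values, probs):
    """values and probs are parallel lists; probs must sum to 1."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(values, probs))

# Example: a fair six-sided die, each face with probability 1/6.
die_faces = [1, 2, 3, 4, 5, 6]
die_probs = [1 / 6] * 6
print(expectation(die_faces, die_probs))  # 3.5
```

Note that the expected value (3.5) need not be a value the variable can actually take.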
The Gaussian (normal) distribution has two parameters, $\mu$ and $\sigma^2$. If $X$ is normally distributed, then $E(X)=\mu$. So the mean of a Gaussian-distributed variable is equal to the parameter $\mu$. This is not always the case. Take the binomial distribution, which has parameters $n$ and $p$. If $X$ is binomially distributed, then $E(X)=np$.
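You can verify the binomial case numerically by applying the discrete definition directly to the binomial pmf and comparing with $np$ (a small sketch; the parameter values are arbitrary):

```python
from math import comb

# E[X] for a binomial(n, p) variable, computed from the definition:
# sum over k of k * P(X = k), with P(X = k) = C(n, k) p^k (1-p)^(n-k).

def binomial_expectation(n, p):
    return sum(k * comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1))

n, p = 10, 0.3
print(binomial_expectation(n, p))  # 3.0, i.e. n * p
```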
As you saw, you can also apply expectation to functions of random variables, so that for a Gaussian $X$ you can find that $E(X^2)=\sigma^2+\mu^2$.
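A quick Monte Carlo check of this identity (the particular $\mu$, $\sigma$, and sample size here are arbitrary choices for illustration):

```python
import random

# For Gaussian X with mean mu and standard deviation sigma,
# the sample average of X^2 should approach sigma^2 + mu^2.

random.seed(0)
mu, sigma = 2.0, 3.0
n_samples = 200_000
mean_sq = sum(random.gauss(mu, sigma) ** 2 for _ in range(n_samples)) / n_samples
print(mean_sq)  # close to sigma**2 + mu**2 = 13.0
```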
The Wikipedia page on expected values is pretty informative: http://en.wikipedia.org/wiki/Expected_value