I am trying to understand conceptually what the following integral gives me or tells me:
$$\int f(x) \cdot g(x) \, dx$$
where $f(x)$ is any continuous function of $x$ and $g(x)$ is the probability density function of a random variable. For example, the normal distribution's PDF is:
$$ g(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} \exp\left( \frac{-(x - \mu)^2}{2 \sigma^2} \right) $$
I understand the integral of a PDF gives me the CDF. So:
$$\int_{-\infty}^0 g(x) \, dx$$
gives me the probability of $x$ being less than $0$. However, what happens when you multiply $g(x)$ by another function $f(x)$ and take the integral? I have heard it gives you the expected payoff, assuming $f(x)$ is a payoff function and you integrate from $-\infty$ to $+\infty$. If true, I understand this conceptually: the sum of payoffs times their probabilities is the expected value of whatever game you are playing.
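For instance, taking $f(x) = x$ would reduce this to the ordinary mean:
$$\int_{-\infty}^{\infty} x \, g(x) \, dx = E[X],$$
which for the normal PDF above is just $\mu$.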
I start getting confused when the bounds of the integral are not $\pm\infty$; I'm not sure what the integral means conceptually in that case. For example:
$$\int_{-\infty}^0 f(x) g(x) \, dx$$
What does that tell me?
Best Answer
Suppose $g$ is the PDF of a random variable $X$. Then
$$E[f(X) \mid X \in A] = \frac{\int_A f(x) g(x) \, dx}{\int_A g(x) \, dx}$$
Hence $$\int_A f(x) g(x) \, dx = \Pr(X \in A) \, E[f(X) \mid X \in A],$$
so it gives you the product of the conditional expectation of $f(X)$ given that $X \in A$ and the probability that $X$ is in $A$.
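For a concrete example, take $X$ standard normal, $f(x) = x$, and $A = (-\infty, 0]$. Writing $\phi$ for the standard normal PDF,
$$\int_{-\infty}^0 x \, \phi(x) \, dx = -\phi(0) = -\frac{1}{\sqrt{2\pi}}, \qquad \Pr(X \le 0) = \frac{1}{2},$$
so the conditional expectation is $E[X \mid X \le 0] = -2\phi(0) = -\sqrt{2/\pi}$.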
I think $E[f(X) \mid X \in A]$ is the more interesting quantity, and I would divide your integral by $\Pr(X \in A)$ to recover it.
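If you want a numerical sanity check of this identity, here is a minimal sketch (assuming NumPy and SciPy are available; the choices $X \sim N(0,1)$, $f(x) = x^2$, and $A = (-\infty, 0]$ are just for illustration):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Illustrative setup: X ~ N(0, 1), f(x) = x**2, A = (-inf, 0]
g = stats.norm.pdf
f = lambda x: x**2

# Left-hand side: integral of f(x) g(x) over A, by quadrature
integral, _ = quad(lambda x: f(x) * g(x), -np.inf, 0.0)

# Right-hand side: Pr(X in A) * E[f(X) | X in A], estimated by Monte Carlo
rng = np.random.default_rng(0)
samples = rng.standard_normal(1_000_000)
in_A = samples <= 0.0
prob_A = in_A.mean()                # ~0.5
cond_exp = f(samples[in_A]).mean()  # E[X^2 | X <= 0], ~1.0 by symmetry

print(integral)           # ~0.5
print(prob_A * cond_exp)  # ~0.5, matching the identity above
```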