Solved – Relationship between autocorrelation function and mean of random process

For instance, if we assume a white noise process $w(t)$, its autocorrelation function would be $R(t)=\delta(t)$. Obviously, the white noise process has zero mean, but how can the autocorrelation function be used to prove that?

White noise is generally defined to be a zero-mean process whose autocorrelation function is $K\delta(t)$ where $K > 0$ and $\delta(t)$ denotes the Dirac delta, and so there is nothing left to prove. Alternatively, a white noise process is a weakly stationary process whose power spectral density $S(f)$ has constant positive value $K$ for all $f$.
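To make the definition concrete, here is a minimal numerical sketch (all parameter values are illustrative, not part of the original question): in discrete time the Dirac delta becomes a Kronecker delta, so a sample autocorrelation of simulated white noise should be approximately $K$ at lag $0$ and approximately $0$ at every other lag, with a sample mean near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 2.0                      # noise power (illustrative choice)
N = 200_000
# Zero-mean white Gaussian noise with variance K
w = rng.normal(0.0, np.sqrt(K), size=N)

def sample_autocorr(x, max_lag):
    """Biased sample autocorrelation estimate R[k] = E[x[n] x[n+k]] for k = 0..max_lag."""
    return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag + 1)])

R = sample_autocorr(w, 5)
print("sample mean:", w.mean())        # close to 0
print("R[0..5]    :", np.round(R, 3))  # approximately [K, 0, 0, 0, 0, 0]
```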

For any weakly stationary (also called wide-sense stationary) random process whose autocorrelation function $R(t)$ is an ordinary non-periodic real-valued even function with a peak at $t=0$ (e.g. $e^{-|t|}$) such that the limiting value of $R(t)$ as $t \to \infty$ exists, the limiting value is $\mu^2$, the square of the mean $\mu$, and the power spectral density $S(f)$ includes a Dirac delta at $f=0$. Such a power spectral density does not match the power spectral density of white noise at all. So, why can't this notion of $\mu^2 = \lim_{t \to \infty} R(t)$ be applied to the autocorrelation function $K\delta(t)$ of white noise to deduce that white noise must have zero mean? Well, $\delta(t)$ is not a function at all; it only walks like a duck and quacks like a duck inside various integrals, not everywhere like all actual ducks do. In particular, $\delta(5)$, say, does not mean the value of the Dirac delta at $t=5$, and so the notion $\lim_{t \to \infty} \delta(t)$ has no meaning.
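The limit $\mu^2 = \lim_{t \to \infty} R(t)$ is easy to see numerically for a process with an ordinary autocorrelation function. A rough sketch (the AR(1) model and its parameters are my own illustrative choice, not from the question): a stationary AR(1) process with mean $\mu$ has autocorrelation $R[k] = \mu^2 + \sigma^2 \rho^{|k|}$, which decays to $\mu^2$ as the lag grows, in contrast with the delta-shaped autocorrelation of white noise.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, rho = 3.0, 0.8            # illustrative mean and AR(1) coefficient
N = 400_000
x = np.empty(N)
x[0] = mu
for n in range(1, N):
    # AR(1) fluctuation around the mean mu, driven by unit-variance noise
    x[n] = mu + rho * (x[n - 1] - mu) + rng.normal(0.0, 1.0)

def sample_autocorr(x, lags):
    """Biased sample autocorrelation estimate at the requested lags."""
    return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in lags])

lags = [0, 1, 5, 20, 50]
print("mu^2      :", mu**2)
print("R at lags :", np.round(sample_autocorr(x, lags), 3))
# At large lags R[k] approaches mu^2 = 9, recovering the mean up to sign.
```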
