From Wikipedia:
a stationary process (or strict(ly) stationary process or strong(ly)
stationary process) is a stochastic process whose joint probability
distribution does not change when shifted in time. Consequently,
parameters such as the mean and variance, if they are present, also do
not change over time and do not follow any trends.
If a given Markov chain admits a limiting distribution, does it mean this Markov chain is stationary?
Edit: to be more precise, can we say the unconditional moments of a Markov chain are those of the limiting (stationary) distribution, and then, since these moments are time-invariant, the process is stationary?
Best Answer
Here's a simple example illustrating why the answer is no.
Let $$P = \begin{pmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{pmatrix}$$ be the transition matrix for a first-order Markov process $X_t$ with state space $\left\{0, 1\right\}$. The limiting distribution is $\pi = \left(0.5, 0.5\right)$. However, suppose you start the process at time zero with initial distribution $\mu = \left(1, 0\right)$, i.e. $X_0 = 0$ with probability one.
We then have $\mathbf{E}[X_0] = 0 \neq \mathbf{E}[X_1] = \frac{1}{2}$, meaning the moments of $X_t$ depend on $t$, which violates the definition of stationarity.
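A quick numerical check of those two moments (a sketch in Python; the transition matrix, initial distribution, and state values 0 and 1 are exactly those of the example above):

```python
import numpy as np

# Transition matrix of the two-state chain from the example above.
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])
states = np.array([0, 1])

mu0 = np.array([1.0, 0.0])  # X_0 = 0 with probability one
mu1 = mu0 @ P               # marginal distribution of X_1

print(mu0 @ states)  # E[X_0] = 0.0
print(mu1 @ states)  # E[X_1] = 0.5
```

The two expectations differ, so the marginal distribution of $X_t$ is not the same at every $t$.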
Here's some R code illustrating a similar example with
$$P = \begin{pmatrix} 0.98 & 0.02 \\ 0.02 & 0.98 \end{pmatrix}$$
```r
p_stay <- 0.98
P <- matrix(1 - p_stay, 2, 2)
diag(P) <- p_stay
stopifnot(all(rowSums(P) == rep(1, nrow(P))))

mu <- c(1, 0)  # start in state 0 with probability one
pi <- matrix(0, 100, 2)
pi[1, ] <- mu
for (time in seq(2, nrow(pi))) {
  pi[time, ] <- pi[time - 1, ] %*% P  # propagate the marginal distribution
}

plot(seq(1, nrow(pi)), pi[, 1], type = "l", xlab = "time", ylab = "Pr[X_t = 0]")
abline(h = 0.5, lty = 2)
```
The fact that $X_t$ converges in distribution to some limit does not mean the process is stationary.
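By contrast, if the chain is started from the limiting distribution itself, the marginal distribution (and hence every moment) is the same at every time, which is why that distribution is also called the stationary distribution. A sketch in Python, using the two-state chain from the R example:

```python
import numpy as np

# Two-state chain from the R example above.
p_stay = 0.98
P = np.array([[p_stay, 1 - p_stay],
              [1 - p_stay, p_stay]])

pi = np.array([0.5, 0.5])  # limiting distribution of this chain

# pi P = pi, so starting from pi the marginal distribution never changes.
print(np.allclose(pi @ P, pi))  # True
```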