Can a finite state Markov chain have an essential transient state?
I have found an example for an infinite state space, and I have the intuition (I may be wrong) that this isn't possible for a finite state space, but I haven't been able to prove it.
Best Answer
According to Wikipedia,
A state $i$ is accessible from a state $j$ (written $j \to i$) if a system started in state $j$ has a non-zero probability of transitioning into state $i$ at some point.
A state $i$ is essential if for all $j$ such that $i \to j$ it is also true that $j \to i$.
A state $i$ is said to be transient if, given that we start in state $i$, there is a non-zero probability that we will never return to $i$.
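For concreteness, here is one way these definitions can be computed from a transition matrix (my own sketch, not part of the original answer): accessibility is reachability in the directed graph whose edges are the positive entries of $P$, and the essential states are then read off from the quoted definition.

```python
import numpy as np

def accessible(P):
    """Boolean matrix A with A[x, y] True iff state y is accessible
    from state x, i.e. there is a path of positive-probability
    transitions from x to y (every state is accessible from itself)."""
    A = np.asarray(P) > 0
    n = A.shape[0]
    A = A | np.eye(n, dtype=bool)         # zero-step paths count
    for m in range(n):                    # transitive closure over intermediates
        A = A | (A[:, m:m+1] & A[m:m+1, :])
    return A

def essential_states(P):
    """States x such that every y accessible from x leads back to x."""
    A = accessible(P)
    n = A.shape[0]
    return [x for x in range(n)
            if all(A[y, x] for y in range(n) if A[x, y])]
```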
A Markov chain with an essential transient state can be constructed from three states $i, j, k$ for which $i \to j$, $j \to i$, $j \to k$, and $k$ never returns to $i$. The transition $j \to k$ guarantees that $i$ is transient: starting from $i$, the chain has a non-zero probability of reaching $k$ via $j$ and then never returning to $i$.
The transition matrix, with states ordered $i, j, k$, is
$$\pmatrix{0 & 1 & 0 \\ 1-\rho & 0 & \rho \\ 0 & 0 & 1}$$
for some number $\rho$ with $0 \lt \rho \lt 1$. It is the chance of never returning to $i$ when starting at $i$: from $i$ the chain moves to $j$ with probability $1$, and from $j$ it is absorbed at $k$, never to return, with probability $\rho$.
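As a quick numerical sanity check (my own addition, not part of the original answer), one can simulate the chain and estimate the probability of never returning to $i$. With the states labeled $0 = i$, $1 = j$, $2 = k$ and, say, $\rho = 0.3$, the estimate should come out close to $\rho$.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.3
P = np.array([
    [0.0,       1.0, 0.0],   # i -> j with probability 1
    [1.0 - rho, 0.0, rho],   # j -> i w.p. 1 - rho, j -> k w.p. rho
    [0.0,       0.0, 1.0],   # k is absorbing
])

n_runs = 100_000
never_returned = 0
for _ in range(n_runs):
    state = 0                          # start at i
    while True:
        state = rng.choice(3, p=P[state])
        if state == 0:                 # returned to i
            break
        if state == 2:                 # absorbed at k, never returns to i
            never_returned += 1
            break

print(f"estimated P(never return to i) = {never_returned / n_runs:.4f}")
print(f"rho                            = {rho}")
```

Each run ends either at the first return to $i$ or at absorption in $k$, so the fraction of runs ending in $k$ estimates the no-return probability.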