What is the definition of a recurrent and transient state respectively?

I ask this because while reading *Markov Chains by J.R. Norris*, he says a state $i$ is **recurrent** if

$P(X_{n}=i \text{ for infinitely many } n \mid X_{0} = i)=1$,

and a state $i$ is **transient** if

$P(X_{n}=i \text{ for infinitely many } n \mid X_{0}=i) = 0$.

I'm more familiar with the definition which says that if $P(X_{n}=i \text{ for some } n \geq 1 \mid X_{0}=i)=1$, then the state $i$ is said to be recurrent, whereas if this probability is strictly less than 1, then the state $i$ is said to be transient.

Is there a way to prove these definitions are equivalent?

I think my problem stems from not understanding the probability statement:

$P(X_{n}=i \text{ for infinitely many } n \mid X_{0}=i)=1$.

What is this statement actually saying?

Any help will be much appreciated. Thanks.


#### Best Answer

The two definitions are equivalent, and the link between them is the strong Markov property. In one direction: if $X_n = i$ for infinitely many $n$ almost surely, then in particular the chain returns to $i$ at least once almost surely. In the other direction: suppose the chain started at $i$ returns to $i$ with probability 1. Each time it hits $i$, the strong Markov property says it starts afresh from $i$, independently of the past, so it returns again with probability 1. Iterating this argument, the chain returns infinitely often with probability 1. Conversely, if the return probability is strictly less than 1, the number of returns is geometrically distributed, so infinitely many returns occur with probability 0.
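To make this precise, here is a short derivation (this is essentially the argument in Norris, where the dichotomy is proved via the strong Markov property; the notation $V_i$ and $f$ below is introduced here for convenience):

```latex
% Let V_i = \#\{n \ge 1 : X_n = i\} be the number of returns to i,
% and let f = P(X_n = i \text{ for some } n \ge 1 \mid X_0 = i)
% be the return probability.
%
% By the strong Markov property, successive excursions from i are
% independent and identically distributed, so each return happens
% with probability f independently of the previous ones:
P(V_i \ge m \mid X_0 = i) = f^m \qquad \text{for all } m \ge 1.
%
% Letting m \to \infty,
P(V_i = \infty \mid X_0 = i)
  = \lim_{m \to \infty} f^m
  = \begin{cases}
      1 & \text{if } f = 1,\\[2pt]
      0 & \text{if } f < 1.
    \end{cases}
```

So $f = 1$ (the "returns at least once" definition of recurrence) holds if and only if $P(X_n = i \text{ for infinitely many } n \mid X_0 = i) = 1$, and $f < 1$ if and only if that probability is $0$. This also answers the question about what the statement means: the event $\{X_n = i \text{ for infinitely many } n\}$ is the event that the sequence of visits to $i$ never terminates, i.e. $\limsup_n \{X_n = i\}$.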