Recurrence and Ergodicity
Consider a Markov chain taking its values in E = ℕ. It may happen that, for any initial state i ∈ ℕ, the chain never returns to i after some finite random time. This is often an undesirable feature. For example, if the chain counts the number of customers waiting in line at a service counter (we shall see Markovian models of waiting lines, or queues, at various places in this book), such behavior implies that the waiting line will eventually exceed the limits of the waiting facility. In this sense, the corresponding system is unstable.
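The instability described above can be illustrated with a minimal simulation sketch (not from the book): a discrete-time birth-death chain on ℕ modeling a queue, where at each step a customer arrives with probability p_arrival and, if the queue is nonempty, a customer departs with probability p_service. The function name and parameter values below are purely illustrative. When p_arrival > p_service the chain is transient and the queue length drifts to infinity; when p_arrival < p_service it is recurrent and the queue stays near 0.

```python
import random

def simulate_queue(p_arrival, p_service, steps, seed=0):
    """Simulate a birth-death chain on the nonnegative integers.

    At each step: with probability p_arrival the state increases by 1;
    otherwise, with probability p_service (and if the state is positive),
    it decreases by 1. Returns the final state.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    x = 0
    for _ in range(steps):
        u = rng.random()
        if u < p_arrival:
            x += 1
        elif u < p_arrival + p_service and x > 0:
            x -= 1
    return x

if __name__ == "__main__":
    # Unstable (transient) regime: arrivals outpace services,
    # so the queue length grows without bound.
    print("unstable:", simulate_queue(0.6, 0.3, 10_000))
    # Stable (recurrent) regime: services outpace arrivals,
    # so the queue length keeps returning to 0.
    print("stable:  ", simulate_queue(0.3, 0.6, 10_000))
```

Under the hypothetical parameters above, the first run drifts upward at roughly 0.3 per step, while the second hovers near zero, matching the informal notion of instability in the text.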
Keywords: Markov chain, stationary distribution, invariant measure, transition matrix, ergodic theorem