Abstract
The topic of Markov processes is huge: whole volumes can be, and in fact have been, written on it. We make no attempt at completeness here. This chapter gives the minimum required in order to follow what is presented afterwards, and we will refer back to it when results stated here are called for. For a more comprehensive treatment of Markov chains and stochastic matrices, see [9, 19, 41] or [42].
Notes
- 1.
A square matrix is called stochastic if all its entries are nonnegative and all its row sums equal one. It is substochastic if its row sums are less than or equal to one.
- 2.
This does not rule out the possibility that a state j in one class is reachable from a state i in some other class (but then, of course, i is not reachable from j).
- 3.
The rationale behind this terminology is that for \(n\) large enough, \(P_{ii}^n > 0\) if and only if \(n \equiv 0 \pmod{d(i)}\).
- 4.
The period is a function only of the graph associated with the Markov chain. In particular, once \(P_{ij}\) is positive, its actual value is immaterial from the point of view of deriving the period's value.
- 5.
This proof appears in [16], p. 165.
- 6.
The proof is as follows. Of course, \(P_{ij}\) is the probability of moving straight to state-j. In the new process there is, however, another option: visit state-j just after state-i by first going to state-n (with probability \(P_{in}\)) and then, conditional on leaving state-n, moving immediately to state-j (with probability \(P_{nj}/(1 - P_{nn})\)).
- 7.
The Perron–Frobenius theorem guarantees that this eigenvalue is real and unique in the case where \(P_{JJ}\) is aperiodic and irreducible. See, e.g., [42], p. 9.
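The definitions in notes 1, 3, and 4 can be sketched in code. The following is a minimal illustration (the function names `is_stochastic` and `period` are ours, not the author's): a matrix is stochastic when its entries are nonnegative and every row sums to one, and the period of state \(i\) is the greatest common divisor of all \(n\) with \(P_{ii}^n > 0\). Note that `period` looks only at which entries of \(P\) are positive, reflecting note 4's point that the period depends only on the associated graph.

```python
import numpy as np
from math import gcd

def is_stochastic(P, tol=1e-12):
    """True if P is square, entrywise nonnegative, and every row sums to one
    (note 1's definition of a stochastic matrix)."""
    P = np.asarray(P, dtype=float)
    return (P >= -tol).all() and np.allclose(P.sum(axis=1), 1.0, atol=tol)

def period(P, i):
    """Period of state i: gcd of all n with (P^n)_{ii} > 0.
    Only the zero/nonzero pattern of P is used, so the actual values of
    the positive entries are immaterial (note 4)."""
    n = P.shape[0]
    A = (np.asarray(P) > 0).astype(int)   # adjacency matrix of the chain's graph
    d = 0
    Ak = np.eye(n, dtype=int)
    for k in range(1, 2 * n * n + 1):     # enough powers for a finite chain
        Ak = (Ak @ A > 0).astype(int)     # Ak[i, i] == 1 iff i -> i in k steps
        if Ak[i, i]:
            d = gcd(d, k)
    return d
```

For example, the two-state chain that alternates deterministically between its states has period 2, matching the intuition that returns to a state can occur only at even times.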
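The argument in note 6 can also be checked numerically. Below is a minimal sketch (the function name `censor_last_state` is ours): when state-n is censored out, the new transition probability from state-i to state-j is \(P_{ij} + P_{in}P_{nj}/(1 - P_{nn})\), i.e., either move straight to state-j, or detour through state-n and, conditional on leaving it, exit to state-j. The resulting matrix should again be stochastic.

```python
import numpy as np

def censor_last_state(P):
    """Transition matrix of the chain observed only on states 0..n-1,
    with the last state n censored out. Entry (i, j) equals
    P_ij + P_in * P_nj / (1 - P_nn), as in note 6: the geometric factor
    1/(1 - P_nn) accounts for any number of self-loops at state n
    before the chain leaves it."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0] - 1                    # index of the censored state
    return P[:n, :n] + np.outer(P[:n, n], P[n, :n]) / (1.0 - P[n, n])
```

Censoring a state of a stochastic matrix again yields a stochastic matrix, which is a quick sanity check on the formula.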
References
Billingsley, P. (1995). Probability and measure (3rd ed.). New York: Wiley.
Denardo, E. V. (1982). Dynamic programming: models and applications. Englewood Cliffs: Prentice-Hall.
Feller, W. (1968). An introduction to probability theory and its applications (3rd ed.). New York: Wiley.
Kemeny, J. G., & Snell, J. L. (1961). Finite Markov chains. New York: D. Van Nostrand.
Ross, S. M. (1996). Stochastic processes (2nd ed.). New York: Wiley.
Seneta, E. (2006). Non-negative matrices and Markov chains (rev. printing). New York: Springer.
Copyright information
© 2013 Springer Science+Business Media New York
Cite this chapter
Haviv, M. (2013). Introduction to Markov Chains. In: Queues. International Series in Operations Research & Management Science, vol 191. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-6765-6_3
DOI: https://doi.org/10.1007/978-1-4614-6765-6_3
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4614-6764-9
Online ISBN: 978-1-4614-6765-6