Abstract
A Markov chain is a sequence of discrete random variables indexed by a time parameter with the property that the conditional probability of a future event, given the present state and the past history, does not depend on the past history. Section 4.1 presents several examples, and Section 4.2 introduces the notions of transience and recurrence and shows how to evaluate absorption probabilities. Section 4.3 is concerned with the asymptotic behavior of the n-step transition probabilities of irreducible, aperiodic Markov chains, and Section 4.4 proves the discrete renewal theorem in the aperiodic setting.
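The convergence of n-step transition probabilities described above can be illustrated numerically. The sketch below is not from the chapter; it uses a small hypothetical irreducible, aperiodic chain and NumPy matrix powers to show each row of P^n approaching the same stationary distribution.

```python
import numpy as np

# A hypothetical irreducible, aperiodic Markov chain on three states
# (illustrative only; not an example from the chapter).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# The n-step transition probabilities are the entries of the matrix power P^n.
Pn = np.linalg.matrix_power(P, 50)

# For an irreducible, aperiodic chain, every row of P^n converges to the
# unique stationary distribution pi, which satisfies pi P = pi.
print(Pn.round(6))
```

By n = 50 the rows of P^n agree to many decimal places, reflecting the limit theorem for irreducible, aperiodic chains treated in Section 4.3.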
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this chapter
Ethier, S.N. (2010). Markov Chains. In: The Doctrine of Chances. Probability and its Applications. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-78783-9_4
DOI: https://doi.org/10.1007/978-3-540-78783-9_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-78782-2
Online ISBN: 978-3-540-78783-9
eBook Packages: Mathematics and Statistics (R0)