Abstract
Heuristically, a discrete-time stochastic process has the Markov property if the past and the future are independent given the present. In this introductory chapter, we give the formal definition of a Markov chain and of the main objects associated with this type of stochastic process, and we establish basic results. In particular, in Section 1.2 we introduce the essential notion of a Markov kernel, which gives the distribution of the next state given the current state. In Section 1.3, we restrict attention to time-homogeneous Markov chains and establish a fundamental consequence of the Markov property: the entire distribution of a Markov chain is characterized by the distribution of its initial state together with a Markov kernel. In Section 1.4, we introduce invariant measures, which play a key role in the study of the long-term behavior of a Markov chain. Finally, in Sections 1.5 and 1.6, which can be skipped on a first reading, we introduce the notion of reversibility, a very convenient property satisfied by many Markov chains, and some further properties of kernels viewed as operators on certain spaces of functions.
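The ideas sketched in the abstract can be illustrated concretely in the finite-state case, where a Markov kernel reduces to a row-stochastic matrix P with P[i][j] the probability of moving from state i to state j. The following minimal sketch (the two-state kernel P, the initial distribution mu, and the helper names are illustrative choices, not from the text) simulates a time-homogeneous chain from an initial distribution and a kernel, and numerically checks the invariance equation pi P = pi for a candidate invariant measure pi.

```python
import random

# Hypothetical two-state Markov kernel on {0, 1}, given as a
# row-stochastic matrix: P[i][j] = probability of a jump from i to j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Initial distribution of X_0 (start in state 0 with probability 1).
mu = [1.0, 0.0]


def sample(dist, rng):
    """Draw a state index from a finite discrete distribution."""
    u, acc = rng.random(), 0.0
    for i, p in enumerate(dist):
        acc += p
        if u < acc:
            return i
    return len(dist) - 1  # guard against floating-point round-off


def simulate(mu, P, n, seed=0):
    """Simulate X_0, ..., X_n: X_0 ~ mu, then X_{k+1} ~ P[X_k, .].

    This mirrors the characterization in the abstract: the initial
    distribution and the kernel fully determine the law of the chain.
    """
    rng = random.Random(seed)
    x = sample(mu, rng)
    path = [x]
    for _ in range(n):
        x = sample(P[x], rng)  # next state depends only on the current one
        path.append(x)
    return path


# An invariant measure pi satisfies pi P = pi; for this P one can check
# by hand that pi = (5/6, 1/6) works.
pi = [5 / 6, 1 / 6]
piP = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

path = simulate(mu, P, n=10)
```

Here `piP` reproduces `pi` up to floating-point error, which is exactly the invariance property studied in Section 1.4; the simulated `path` has length n + 1 and each step uses only the current state, reflecting the Markov property.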
© 2018 Springer Nature Switzerland AG
Douc, R., Moulines, E., Priouret, P., Soulier, P. (2018). Markov Chains: Basic Definitions. In: Markov Chains. Springer Series in Operations Research and Financial Engineering. Springer, Cham. https://doi.org/10.1007/978-3-319-97704-1_1
Print ISBN: 978-3-319-97703-4
Online ISBN: 978-3-319-97704-1
eBook Packages: Mathematics and Statistics; Mathematics and Statistics (R0)