Markov Chains: Basic Definitions

  • Randal Douc
  • Eric Moulines
  • Pierre Priouret
  • Philippe Soulier
Part of the Springer Series in Operations Research and Financial Engineering book series (ORFE)


Heuristically, a discrete-time stochastic process has the Markov property if its past and future are independent given its present. In this introductory chapter, we give the formal definition of a Markov chain and of the main objects related to this type of stochastic process, and we establish basic results. In particular, Section 1.2 introduces the essential notion of a Markov kernel, which gives the distribution of the next state given the current state. Section 1.3 restricts attention to time-homogeneous Markov chains and establishes a fundamental consequence of the Markov property: the entire distribution of a Markov chain is characterized by the distribution of its initial state together with a Markov kernel. Section 1.4 introduces invariant measures, which play a key role in the study of the long-term behavior of a Markov chain. Finally, Sections 1.5 and 1.6, which can be skipped on a first reading, introduce the notion of reversibility, a convenient property satisfied by many Markov chains, and further properties of kernels viewed as operators on certain spaces of functions.
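In the finite-state case, these objects can be sketched numerically: the kernel is a row-stochastic matrix, the law of the chain at time n is determined by the initial distribution and the kernel alone, an invariant measure is a left fixed point of the kernel, and reversibility amounts to detailed balance. The 3-state kernel and initial distribution below are illustrative choices, not examples from the chapter.

```python
import numpy as np

# A hypothetical 3-state Markov kernel: P[i, j] is the probability of
# moving from state i to state j, so each row sums to 1.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

mu0 = np.array([1.0, 0.0, 0.0])  # initial distribution: start in state 0

def marginal(mu0, P, n):
    """Law of X_n, determined by mu0 and P alone: mu_n = mu0 @ P^n."""
    return mu0 @ np.linalg.matrix_power(P, n)

# An invariant probability measure pi satisfies pi @ P = pi; for a finite
# kernel it is a left eigenvector of P for the eigenvalue 1, normalised
# to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Reversibility (detailed balance): pi[i] * P[i, j] == pi[j] * P[j, i]
# for all i, j, i.e. the flow matrix pi[i] * P[i, j] is symmetric.
flow = pi[:, None] * P
reversible = np.allclose(flow, flow.T)
```

For this particular kernel (a birth-and-death chain on three states), pi works out to (0.25, 0.5, 0.25) and the detailed-balance check succeeds, illustrating that reversible chains are easy to exhibit in the finite case.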

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Randal Douc, Département CITI, Telecom SudParis, Évry, France
  2. Eric Moulines, Centre de Mathématiques Appliquées, Ecole Polytechnique, Palaiseau, France
  3. Pierre Priouret, Université Pierre et Marie Curie, Paris, France
  4. Philippe Soulier, Université Paris Nanterre, Nanterre, France