Fundamental definitions

  • Kai Lai Chung
Chapter
Part of the book series Die Grundlehren der Mathematischen Wissenschaften (volume 104)

Abstract

The precise definition of the term “Markov chain” as used in this monograph will be given below. However, the following remarks will help clarify our usage for the benefit of those readers who have had previous contact with the terminology. A Markov process is a special type of stochastic process distinguished by a certain Markov property; a Markov chain is a Markov process with a denumerable (namely, finite or denumerably infinite) number of states. The time parameter may be taken to be the set of nonnegative integers or the set of nonnegative real numbers; accordingly we have the discrete parameter case or the continuous parameter case. The adjective “simple” is sometimes used to qualify our Markov chain, but since we do not discuss “multiple” chains we shall not make the distinction. Furthermore, we shall discuss only Markov chains “with stationary (or temporally homogeneous) transition probabilities” so that the qualifying phrase in quotes will be understood. Finally, our discussion does not differentiate between a finite or a denumerably infinite number of states so that no special treatment is given to the former case.
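The discrete parameter case described above can be made concrete with a short simulation. The sketch below is illustrative only and not from the monograph: the two-state space, the transition matrix `P`, and the routine `simulate_chain` are all assumptions chosen for the example. The key point it demonstrates is stationarity: the same transition probabilities govern every step, regardless of the time index.

```python
import random

def simulate_chain(P, start, n_steps, rng=None):
    """Simulate a discrete-parameter Markov chain with stationary
    (temporally homogeneous) transition probabilities.

    P: dict mapping each state to a list of (next_state, probability)
       pairs; the probabilities out of each state must sum to 1.
    start: initial state.
    n_steps: number of transitions to take.
    Returns the sampled path as a list of n_steps + 1 states.
    """
    rng = rng or random.Random()
    path = [start]
    state = start
    for _ in range(n_steps):
        # Sample the next state by inverting the cumulative distribution
        # of the current row of P; the same P is used at every step.
        r = rng.random()
        acc = 0.0
        for nxt, p in P[state]:
            acc += p
            if r < acc:
                state = nxt
                break
        path.append(state)
    return path

# Hypothetical two-state chain with states "a" and "b".
P = {
    "a": [("a", 0.7), ("b", 0.3)],
    "b": [("a", 0.4), ("b", 0.6)],
}
path = simulate_chain(P, "a", 10, rng=random.Random(0))
```

Because the chain is temporally homogeneous, the function never consults the step index when sampling a transition; a nonstationary chain would instead need a family of matrices indexed by time.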

Copyright information

© Springer-Verlag OHG. Berlin · Göttingen · Heidelberg 1960

Authors and Affiliations

  • Kai Lai Chung, Syracuse University, USA
