Probability Theory, pp. 89–108

# Markov Chains

## Abstract

A stochastic process, or random process, is a family of random variables *X*_{t}, *t* ∈ *I*, taking values in some measurable space (*E*, ξ), where *I* is an interval of integers or reals. *I* has the interpretation of a (discrete or continuous) time interval, *t* thereby being a time index. The simplest random process in discrete time is a sequence of independent random variables. This can be considered a process without memory: the value of the process at a given time instant is independent of the values it took in the past. The next level of complication one can conceive of is a one-step memory: the value taken by the process at time *n* + 1 depends on its past values only through its dependence, if any, on the value at time *n*, not otherwise. To be precise, it is conditionally independent of its values up to time *n* given its value at time *n*. This is called the Markov property, and such a process is called a Markov chain.
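The one-step memory described above can be sketched in code. The following is a minimal illustration, not taken from the chapter: a hypothetical two-state transition matrix *P* is assumed, where entry *P*[i][j] gives the probability of moving from state *i* to state *j*, and each new state is drawn using only the current state.

```python
import numpy as np

# Hypothetical transition matrix for a two-state chain (an assumption for
# illustration, not from the text): rows must sum to 1, and P[i][j] is the
# probability of jumping from state i to state j.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def simulate(P, x0, n_steps, rng):
    """Simulate a trajectory of the chain started at x0.

    The Markov property appears here directly: the distribution of the next
    state is read off from the row P[current_state] alone -- earlier states
    play no role.
    """
    states = [x0]
    for _ in range(n_steps):
        current = states[-1]
        states.append(int(rng.choice(len(P), p=P[current])))
    return states

rng = np.random.default_rng(0)
path = simulate(P, x0=0, n_steps=10, rng=rng)
print(path)
```

A sequence of independent random variables is recovered as the special case where every row of *P* is identical, so the past (including the present state) carries no information about the next value.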

## Keywords

Markov Chain · Stationary Distribution · Transition Matrix · Greatest Common Divisor · Invariant Probability Measure