Abstract
A Markov chain (abbreviation: MC) is a Markov process {Zₙ : n ∈ T} with a discrete time parameter space T and a finite or countably infinite state space S. Without loss of generality, we take both S and T to be subsets of the integers.
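The definition above can be illustrated with a short simulation. The following sketch (not from the chapter; the three-state transition matrix P is hypothetical) generates a sample path (Z₀, Z₁, …, Zₙ) of a chain on S = {0, 1, 2}, where each step depends only on the current state — the Markov property.

```python
import random

# Hypothetical transition matrix: P[i][j] is the probability of
# moving from state i to state j. Each row sums to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def step(state, rng):
    """Draw the next state given the current one; the distribution
    depends only on the current state (the Markov property)."""
    u = rng.random()
    cumulative = 0.0
    for j, p in enumerate(P[state]):
        cumulative += p
        if u < cumulative:
            return j
    return len(P[state]) - 1  # guard against floating-point round-off

def simulate(z0, n_steps, seed=0):
    """Return the sample path [Z_0, Z_1, ..., Z_n] started from z0."""
    rng = random.Random(seed)
    path = [z0]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate(0, 10))
```

Any chain with a countable state space can be simulated this way once its one-step transition probabilities are specified.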
Copyright information
© 1974 George Allen & Unwin Ltd
About this chapter
Cite this chapter
Coleman, R. (1974). Markov Chains. In: Stochastic Processes. Problem Solvers, vol 14. Springer, Dordrecht. https://doi.org/10.1007/978-94-010-9796-3_4
Publisher Name: Springer, Dordrecht
Print ISBN: 978-0-04-519017-1
Online ISBN: 978-94-010-9796-3
eBook Packages: Springer Book Archive