Markov Chains
Chapter
Abstract
A sequence of random variables \( X_0, X_1, \dots \) with values in a countable set S is a Markov chain if at any time n, the future states (or values) \( X_{n+1}, X_{n+2}, \dots \) depend on the history \( X_0, \dots, X_n \) only through the present state \( X_n \).
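The defining property — that the next state depends on the history only through the present state — can be illustrated with a minimal simulation. The sketch below is not from the chapter; it assumes a hypothetical two-state space \( S = \{0, 1\} \) and a made-up transition matrix, and shows that long-run empirical frequencies approach the stationary distribution.

```python
import random

# Hypothetical transition matrix P on the state space S = {0, 1}:
# from each state, a list of (next_state, probability) pairs.
P = {0: [(0, 0.9), (1, 0.1)],
     1: [(0, 0.5), (1, 0.5)]}

def step(state):
    # The next state is drawn using only the present state
    # (the Markov property): the rest of the history is irrelevant.
    r = random.random()
    acc = 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(x0, n):
    # Run the chain for n steps starting from x0.
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

# Empirical state frequencies over a long run approximate the
# stationary distribution, here (5/6, 1/6) for the matrix above.
counts = {0: 0, 1: 0}
for x in simulate(0, 100_000):
    counts[x] += 1
```

For this particular matrix, solving \( \pi = \pi P \) gives \( \pi = (5/6, 1/6) \), which the counts above recover approximately.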
Keywords
Markov chain; random walk; stationary distribution; invariant measure; transition matrix
Copyright information
© Springer-Verlag Berlin Heidelberg 2009