Markov Chains with Two States
A stochastic process is a collection of random variables, usually considered to be indexed by time. In this book we consider sequences of random variables X1, X2, …, viewing the subscripts 1, 2, … as successive steps in time. The values assumed by the random variables Xn are called states of the process, and the set of all its states is called the state space. Sometimes it is convenient to think of the process as describing the movement of a particle over time. If X1 = i and X2 = j, then we say that the process (or particle) has made a transition from state i at step 1 to state j at step 2. Often we are interested in the behavior of the process over the long run—after many transitions.
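The ideas above can be made concrete with a small simulation sketch. The two-state chain below is an illustrative example of my own, not taken from the text: a particle moves between states 0 and 1, with assumed transition probabilities p (0 → 1) and q (1 → 0), and we track its long-run behavior by counting the fraction of steps spent in state 1.

```python
import random

def simulate_two_state_chain(p, q, steps, seed=0):
    """Simulate a two-state Markov chain on states {0, 1}.

    p: probability of a transition 0 -> 1 at each step.
    q: probability of a transition 1 -> 0 at each step.
    Returns the fraction of steps spent in state 1.
    (Illustrative sketch; p and q are assumed values, not from the text.)
    """
    rng = random.Random(seed)
    state = 0          # start the particle in state 0
    time_in_1 = 0      # count visits to state 1
    for _ in range(steps):
        if state == 0:
            state = 1 if rng.random() < p else 0
        else:
            state = 0 if rng.random() < q else 1
        time_in_1 += state
    return time_in_1 / steps

# Over the long run the fraction of time in state 1 settles near
# p / (p + q); here that is 0.3 / 0.4 = 0.75.
frac = simulate_two_state_chain(p=0.3, q=0.1, steps=100_000)
print(frac)
```

Running the simulation for many steps shows the long-run stabilization the paragraph alludes to: the empirical fraction of time in each state converges, regardless of the starting state.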
Keywords: Markov Chain · Transition Matrix · Conditional Distribution · Gibbs Sampler · Markov Property