Markov Chains with Two States

  • Eric A. Suess
  • Bruce E. Trumbo
Chapter
Part of the Use R! book series (USE R)

Abstract

A stochastic process is a collection of random variables, usually considered to be indexed by time. In this book we consider sequences of random variables X_1, X_2, …, viewing the subscripts 1, 2, … as successive steps in time. The values assumed by the random variables X_n are called states of the process, and the set of all its states is called the state space. Sometimes it is convenient to think of the process as describing the movement of a particle over time. If X_1 = i and X_2 = j, then we say that the process (or particle) has made a transition from state i at step 1 to state j at step 2. Often we are interested in the behavior of the process over the long run, after many transitions.
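As an illustration of these ideas, the following is a minimal R sketch, not taken from the chapter, that simulates a two-state Markov chain from a hypothetical 2 x 2 transition matrix P and tallies the long-run proportion of time spent in each state.

  # Minimal sketch (illustrative values, not from the chapter):
  # simulate a two-state Markov chain with one-step transition matrix P,
  # where P[i, j] is the probability of moving from state i to state j.
  set.seed(1234)
  P <- matrix(c(0.9, 0.1,
                0.2, 0.8),
              nrow = 2, byrow = TRUE)
  m <- 10000                  # number of steps to simulate
  x <- numeric(m)
  x[1] <- 1                   # start the chain in state 1
  for (n in 2:m) {
    # draw the next state from the row of P indexed by the current state
    x[n] <- sample(1:2, size = 1, prob = P[x[n - 1], ])
  }
  table(x) / m                # long-run proportion of time in each state

With the illustrative matrix above, the simulated proportions should settle near 2/3 for state 1 and 1/3 for state 2, the chain's long-run distribution.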

Keywords

Markov chain · Transition matrix · Conditional distribution · Gibbs sampler · Markov property


Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Department of Statistics and Biostatistics, California State University, East Bay, Hayward, USA
