Examples of Markov Chains with Larger State Spaces

Part of the Use R! book series


In Chapter 6, we took advantage of the simplicity of 2-state chains to introduce fundamental ideas of Markov dependence and long-run behavior using only elementary mathematics. However, many simulations of practical importance require Markov chains that take more than two values. Chains with larger state spaces can behave in very intricate ways, and a rigorous mathematical treatment of them is beyond the scope of this book. Our approach in this chapter is to provide examples that illustrate some of the important behaviors of more general Markov chains.
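Since the book's examples are written in R, a minimal sketch of simulating such a chain may be helpful. The 4-state transition matrix, seed, and run length below are our own illustration (not taken from the chapter): each step draws the next state from the row of P corresponding to the current state.

```r
# Hypothetical illustration: simulate a Markov chain with four states.
# Row i of P gives the transition probabilities out of state i.
set.seed(1237)
P <- matrix(c(0.5, 0.5, 0.0, 0.0,
              0.5, 0.0, 0.5, 0.0,
              0.0, 0.5, 0.0, 0.5,
              0.0, 0.0, 0.5, 0.5), nrow = 4, byrow = TRUE)

m <- 10000                  # number of steps to simulate
x <- numeric(m)
x[1] <- 1                   # start the chain in state 1
for (i in 2:m) {
  # next state is drawn using the row of P for the current state
  x[i] <- sample(1:4, 1, prob = P[x[i - 1], ])
}
table(x) / m                # long-run proportion of time spent in each state
```

Over a long run, the observed proportions of time in each state approximate the chain's limiting distribution, the kind of behavior the chapter's examples explore.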


Keywords: Markov chain, state space, random walk, Gibbs sampler, bivariate normal distribution



Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Department of Statistics and Biostatistics, California State University, East Bay, Hayward, USA
