Examples of Markov Chains with Larger State Spaces
In Chapter 6, we took advantage of the simplicity of 2-state chains to introduce fundamental ideas of Markov dependence and long-run behavior using only elementary mathematics. Markov chains taking more than two values are needed in many simulations of practical importance. These chains with larger state spaces can behave in very intricate ways, and a rigorous mathematical treatment of them is beyond the scope of this book. Our approach in this chapter is to provide examples that illustrate some of the important behaviors of more general Markov chains.
Keywords: Markov chain · state space · random walk · Gibbs sampler · bivariate normal distribution
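To make the idea of a larger state space concrete, here is a minimal sketch (not from the chapter itself) of simulating a 4-state random walk with reflecting boundaries and estimating its long-run proportions; the transition matrix `P` and the function `simulate` are illustrative choices, not the book's notation.

```python
import random

# Hypothetical 4-state random walk on {0, 1, 2, 3} with reflecting
# boundaries: interior states move up or down with equal probability;
# the boundary states bounce back inward with probability 1.
P = [
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 1.0, 0.0],
]

def simulate(P, start, steps, seed=0):
    """Run the chain and return the fraction of time spent in each state."""
    rng = random.Random(seed)
    state = start
    counts = [0] * len(P)
    for _ in range(steps):
        counts[state] += 1
        # Draw the next state from row `state` of the transition matrix.
        u = rng.random()
        cum = 0.0
        for j, p in enumerate(P[state]):
            cum += p
            if u < cum:
                state = j
                break
    return [c / steps for c in counts]

props = simulate(P, start=0, steps=100_000)
print(props)  # long-run proportions, close to (1/6, 1/3, 1/3, 1/6)
```

Solving the detailed-balance equations for this matrix gives the stationary distribution (1/6, 1/3, 1/3, 1/6), and the simulated time-average proportions converge to it even though this particular chain is periodic, illustrating the kind of long-run behavior the chapter's examples explore.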