The Metropolis–Hastings Algorithm

  • Christian P. Robert
  • George Casella
Part of the Springer Texts in Statistics book series (STS)

Abstract

This chapter is the first of a series on simulation methods based on Markov chains. However, it is a somewhat strange introduction because it contains a description of the most general algorithm of all. The next chapter (Chapter 8) concentrates on the more specific slice sampler, which then introduces the Gibbs sampler (Chapters 9 and 10), which, in turn, is a special case of the Metropolis–Hastings algorithm. (However, the Gibbs sampler is different in both fundamental methodology and historical motivation.)
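The chapter's algorithm is not reproduced on this preview page. For orientation only, here is a minimal sketch of a random-walk Metropolis–Hastings sampler — a standard special case of the general algorithm the chapter covers, not code from the book itself; the function names and parameters are illustrative.

```python
import math
import random

def metropolis_hastings(log_target, x0, scale=1.0, n_samples=1000, seed=0):
    """Random-walk Metropolis-Hastings: draw from a density known only up
    to a normalizing constant, via its log, using a symmetric Gaussian
    random-walk proposal."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, scale)
        # With a symmetric proposal, the Hastings ratio reduces to the
        # ratio of target densities at the proposed and current points.
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_alpha:
            x = proposal  # accept; otherwise stay at the current state
        samples.append(x)
    return samples

# Illustration: target a standard normal (log-density up to a constant).
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0,
                            scale=2.0, n_samples=5000)
mean = sum(draws) / len(draws)
```

The chain's stationary distribution is the target regardless of the proposal scale, though the scale governs the acceptance rate and mixing speed — themes the chapter develops in detail.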

Keywords

Markov chain; random walk; Markov chain Monte Carlo; acceptance rate; proposal distribution


Copyright information

© Springer Science+Business Media New York 2004

Authors and Affiliations

  • Christian P. Robert (1)
  • George Casella (2)
  1. CEREMADE, Université Paris Dauphine, Paris Cedex 16, France
  2. Department of Statistics, University of Florida, Gainesville, USA
