The Multi-Stage Gibbs Sampler

  • Christian P. Robert
  • George Casella
Chapter
Part of the Springer Texts in Statistics book series (STS)

Abstract

After two chapters of preparation on the slice sampler and the two-stage Gibbs sampler, respectively, we are now ready to present the full picture of the Gibbs sampler. We describe the general method in Section 10.1; its theoretical properties are less complete than those of the two-stage special case (see Section 10.2). The defining difference between that sampler and the multi-stage version considered here is that the interleaving structure of the two-stage chain does not carry over. Among the consequences of interleaving are that the individual subchains are themselves Markov chains, and that the Duality Principle and Rao-Blackwellization hold in some generality; none of this remains true in the multi-stage case. Nevertheless, the multi-stage Gibbs sampler enjoys many optimality properties and may still be considered the workhorse of the MCMC world. The remainder of the chapter deals with implementation considerations, many connected with the important role of the Gibbs sampler in Bayesian statistics.
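
The chapter text itself is not reproduced on this page, but the idea summarized above can be made concrete with a minimal sketch of a systematic-scan multi-stage Gibbs sampler: each coordinate is updated in turn from its full conditional distribution, one complete sweep per iteration. The toy trivariate Gaussian target, the covariance matrix, and the names gibbs_sampler and sample_full_conditional below are illustrative assumptions, not taken from the book.

```python
# Minimal sketch of a multi-stage (systematic-scan) Gibbs sampler.
# Illustrative target: a zero-mean trivariate normal, chosen only
# because its full conditionals are available in closed form.

import numpy as np

rng = np.random.default_rng(0)

# Assumed covariance matrix for the toy target N(0, Sigma).
Sigma = np.array([[1.0, 0.6, 0.3],
                  [0.6, 1.0, 0.5],
                  [0.3, 0.5, 1.0]])
p = Sigma.shape[0]

def sample_full_conditional(i, x):
    """Draw x_i from pi(x_i | x_{-i}) for the Gaussian target above."""
    idx = [j for j in range(p) if j != i]
    S_rest_inv = np.linalg.inv(Sigma[np.ix_(idx, idx)])
    cond_mean = Sigma[i, idx] @ S_rest_inv @ x[idx]
    cond_var = Sigma[i, i] - Sigma[i, idx] @ S_rest_inv @ Sigma[idx, i]
    return rng.normal(cond_mean, np.sqrt(cond_var))

def gibbs_sampler(n_iter, x0):
    """One complete sweep over all coordinates per iteration."""
    x = np.array(x0, dtype=float)
    chain = np.empty((n_iter, p))
    for t in range(n_iter):
        for i in range(p):          # the multi-stage sweep
            x[i] = sample_full_conditional(i, x)
        chain[t] = x
    return chain

chain = gibbs_sampler(5000, x0=np.zeros(p))
print("empirical covariance:\n", np.round(np.cov(chain.T), 2))
```

Under these assumptions the empirical covariance of the chain should approximate Sigma; unlike the two-stage case, the individual coordinate subchains produced here are not themselves Markov chains.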

Keywords

Markov chain, posterior distribution, conditional distribution, Gibbs sampler, transition kernel



Copyright information

© Springer Science+Business Media New York 2004

Authors and Affiliations

  • Christian P. Robert (1)
  • George Casella (2)
  1. CEREMADE, Université Paris Dauphine, Paris Cedex 16, France
  2. Department of Statistics, University of Florida, Gainesville, USA
