Abstract
In the decades since Markov chain Monte Carlo (MCMC) methods were first introduced, they have revolutionised Bayesian approaches to statistical inference. Each new advance in MCMC methodology produces near-immediate benefits for Bayesian practitioners, expanding the range of problems they can feasibly solve. In this paper, we explore ways in which Bayesian approaches can repay something of the debt owed to MCMC, by using explicitly Bayesian concepts to aid in the design of MCMC samplers. The art of efficient MCMC sampling lies in designing a Markov process that (a) has the required limiting distribution, (b) has good convergence and mixing properties and (c) can be implemented in a computationally efficient manner. We explore the idea that the selection of an appropriate process, and in particular the tuning of its parameters to achieve the above goals, can be regarded as a problem of estimation. As such, it is amenable to a conventional Bayesian approach, in which a prior distribution for the optimal parameters of the sampler is specified, data relevant to sampler performance are obtained, and a posterior distribution for the optimal parameters is formed. Sampling from this posterior distribution can then be incorporated into the MCMC sampler to produce an adaptive method. We present a new MCMC algorithm for Bayesian adaptive Metropolis-Hastings sampling (BAMS), which uses explicitly Bayesian inference to update the proposal distribution. We show that the first author's earlier Bayesian adaptive independence sampler (BAIS) and a new Bayesian adaptive random walk sampler (BARS) emerge as special cases. More importantly, BAMS provides a general framework within which to explore adaptive schemes that are guaranteed to converge to the required limiting distribution.
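The core idea of the abstract — treating proposal tuning as Bayesian estimation and feeding posterior draws of the proposal parameters back into the sampler — can be illustrated with a minimal sketch. This is not the paper's BAMS algorithm; it assumes a toy standard-normal target and a Gaussian independence proposal whose mean and variance are periodically refreshed by sampling from a conjugate normal-inverse-gamma posterior fitted to the chain history. The naive adaptation shown here does not by itself establish the convergence guarantees the paper's framework provides.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy target: standard normal density, up to an additive constant.
    return -0.5 * x * x

def sample_proposal_params(history, mu0=0.0, k0=1.0, a0=2.0, b0=1.0):
    # Conjugate normal-inverse-gamma update: draw (mean, variance) for the
    # independence proposal from their posterior given the chain history.
    data = np.asarray(history)
    n = data.size
    xbar = data.mean()
    ss = np.sum((data - xbar) ** 2)
    kn = k0 + n
    mun = (k0 * mu0 + n * xbar) / kn
    an = a0 + n / 2.0
    bn = b0 + 0.5 * ss + (k0 * n * (xbar - mu0) ** 2) / (2.0 * kn)
    var = bn / rng.gamma(an)              # inverse-gamma draw for the variance
    mean = rng.normal(mun, np.sqrt(var / kn))
    return mean, var

def adaptive_independence_sampler(n_iter=5000, adapt_every=100):
    x = 0.0
    mean, var = 0.0, 10.0                 # deliberately poor initial proposal
    history = [x]
    for i in range(1, n_iter + 1):
        y = rng.normal(mean, np.sqrt(var))  # independence proposal draw
        # MH acceptance ratio for an independence proposal:
        # pi(y) q(x) / (pi(x) q(y)), computed on the log scale.
        def log_q(z):
            return -0.5 * (z - mean) ** 2 / var
        log_alpha = (log_target(y) + log_q(x)) - (log_target(x) + log_q(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        history.append(x)
        if i % adapt_every == 0:
            # Bayesian adaptation step: resample the proposal parameters
            # from their posterior given the chain so far.
            mean, var = sample_proposal_params(history)
    return np.array(history)

chain = adaptive_independence_sampler()
print(chain.mean(), chain.var())  # sample moments should approach 0 and 1
```

As the chain accumulates, the posterior over the proposal parameters concentrates near the target's own mean and variance, so the independence proposal improves and the acceptance rate rises; the paper's contribution is the general construction under which such adaptation provably preserves the required limiting distribution.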
References
Andrieu, C., Moulines, E.: On the ergodicity properties of some adaptive MCMC algorithms. Ann. Appl. Probab. 16, 1462–1505 (2006)
Andrieu, C., Thoms, J.: A tutorial on adaptive MCMC. Stat. Comput. 18, 343–373 (2008)
Gelman, A., Roberts, G.O., Gilks, W.R.: Efficient Metropolis jumping rules. Bayesian Statist. 5, 599–607 (1996)
Haario, H., Laine, M., Mira, A., Saksman, E.: DRAM: efficient adaptive MCMC. Stat. Comput. 16, 339–354 (2006)
Hastings, W.K.: Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57, 97–109 (1970)
Higdon, D.M.: Auxiliary variable methods for Markov chain Monte Carlo with applications. J. Amer. Statist. Assoc. 93, 585–595 (1998)
Keith, J.M., Kroese, D.P., Bryant, D.: A generalized Markov sampler. Methodol. Comput. Appl. Probab. 6, 29–53 (2004)
Keith, J.M., Kroese, D.P., Sofronov, G.Y.: Adaptive independence samplers. Stat. Comput. 18, 409–420 (2008)
Metropolis, N., Rosenbluth, A.W., Rosenbluth, M.N., Teller, A.H., Teller, E.: Equation of state calculations by fast computing machines. J. Chem. Phys. 21, 1087–1092 (1953)
R Development Core Team: R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna. ISBN 3-900051-07-0. http://www.R-project.org (2008)
Roberts, G.O., Rosenthal, J.S.: Examples of adaptive MCMC. J. Comput. Graph. Statist. 18, 349–367 (2009)
© 2013 Springer-Verlag Berlin Heidelberg
Keith, J.M., Davey, C.M. (2013). Bayesian Approaches to the Design of Markov Chain Monte Carlo Samplers. In: Dick, J., Kuo, F., Peters, G., Sloan, I. (eds) Monte Carlo and Quasi-Monte Carlo Methods 2012. Springer Proceedings in Mathematics & Statistics, vol 65. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41095-6_22
Print ISBN: 978-3-642-41094-9
Online ISBN: 978-3-642-41095-6