# Overview of Markov Chain Monte Carlo for Statistical Inference and its Application

• Tao Bo
• Chin Teck Chai
Conference paper
Part of the Advances in Soft Computing book series (AINSC, volume 14)

## Abstract

This paper presents an overview of Markov Chain Monte Carlo (MCMC) methods for statistical inference and their applications. The article begins by describing ordinary Monte Carlo methods, which in principle have the same goals as MCMC but can rarely be implemented in practice. Following that, basic Markov Chain Monte Carlo is discussed, which is founded on the Hastings algorithm and includes the Metropolis method and the Gibbs sampler as special cases. Finally, several special applications of Markov Chain Monte Carlo methods are briefly mentioned, and some recent developments in MCMC are covered in the final remarks section.
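The Hastings algorithm mentioned above accepts a proposed move with probability min(1, α), where α is the ratio of target densities times the ratio of proposal densities. With a symmetric random-walk proposal this reduces to the Metropolis method. The following is a minimal illustrative sketch of that special case, not code from the paper; the target (a standard normal, known up to a constant) and all names are chosen here for illustration.

```python
import math
import random

def metropolis(log_target, x0, step=1.0, n_samples=20000, seed=0):
    """Random-walk Metropolis sampler.

    Proposals are symmetric Gaussian perturbations, so the Hastings
    acceptance ratio reduces to target(x') / target(x).
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        x_prop = x + rng.gauss(0.0, step)               # propose a move
        log_alpha = log_target(x_prop) - log_target(x)  # log acceptance ratio
        if math.log(rng.random()) < log_alpha:          # accept w.p. min(1, alpha)
            x = x_prop
        samples.append(x)                               # reject keeps the old state
    return samples

# Target: standard normal density, up to its normalizing constant --
# MCMC only needs the target up to proportionality.
log_normal = lambda x: -0.5 * x * x

chain = metropolis(log_normal, x0=3.0)
mean = sum(chain) / len(chain)
var = sum((x - mean) ** 2 for x in chain) / len(chain)
```

Note that the chain's draws are correlated and its early iterations depend on the starting point, which is why ergodic averages over many iterations, rather than independent draws, are used for inference.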

## Keywords

Real Estate, Markov Chain Monte Carlo, Gibbs Sampler, Detailed Balance, Markov Chain Monte Carlo Method
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

## References

1. Marsaglia, G. and Zaman, A., The KISS generator. Tech. Report, Dept. of Statistics, University of Florida, 1993.
2. Knuth, D., The Art of Computer Programming, Vol. 2. Addison-Wesley, Reading, 1981.
3. Penttinen, A., Modelling interaction in spatial point patterns: parameter estimation by the maximum likelihood method. Jyväskylä Studies in Computer Science, Economics and Statistics, 1984.
4. Geyer, C.J., Markov chain Monte Carlo maximum likelihood. In: Computing Science and Statistics: Proceedings of the 23rd Symposium on the Interface, 156–163, 1991.
5. Geyer, C.J. and Thompson, E.A., Constrained Monte Carlo maximum likelihood for dependent data (with discussion). Journal of the Royal Statistical Society B, 657–699, 1992.
6. Hastings, W.K., Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 1970.
7. Metropolis, N., Rosenbluth, A.W., Rosenbluth, M.N., Teller, A.H. and Teller, E., Equation of state calculations by fast computing machines. Journal of Chemical Physics, 1087–1092, 1953.
8. Geman, S. and Geman, D., Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Trans. Pattern Anal. Mach. Intell., 721–741, 1984.
9. Gamerman, D., Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference. Chapman and Hall, London, 1997.
10. Knight, J.R., Sirmans, C.F., Gelfand, A.E. and Ghosh, S.K., Analyzing real estate data problems using the Gibbs sampler. Real Estate Economics, 26(3), 469–492, 1998.
11. Little, R.J.A. and Rubin, D.B., Statistical Analysis with Missing Data. John Wiley and Sons, New York, 1989.
12. Mitchell, M. and Hofstadter, D.R., The emergence of understanding in a computer model of concepts and analogy-making. Physica D, 42, 322–334, 1990.
13. Neal, R.M., Connectionist learning of belief networks. Artificial Intelligence, 56, 71–113, 1992.
14. Creutz, M., Global Monte Carlo algorithms for many-fermion systems. Physical Review D, 1228–1238, 1988.