COMPSTAT, pp. 331–336

Approximate Bayesian inference for simple mixtures

  • Conference paper

Abstract

Exact likelihoods and posterior densities associated with mixture data are computationally complex because of the large number of terms involved, corresponding to the large number of possible ways in which the observations might have evolved from the different components of the mixture. This feature is partially responsible for the need to use an algorithm such as the EM algorithm for calculating maximum likelihood estimates and, in Bayesian analysis, to represent posterior densities by a set of simulated samples generated by Markov chain Monte Carlo; see for instance Diebolt and Robert (1994).
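
The following is not from the paper itself, but a minimal numerical sketch of the point the abstract makes, under stated assumptions: for n observations from a k-component mixture, the exact likelihood expands into k^n allocation terms, one per way of assigning observations to components, whereas iterative schemes such as EM work with the equivalent compact form prod_i sum_j w_j phi(x_i; mu_j). The two-component Gaussian mixture below (unit variances, simulated data; all names and values are illustrative, not the authors') contrasts the brute-force expansion with a few EM steps.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Illustrative data: n draws from a two-component Gaussian mixture
# (n kept small so the brute-force k**n sum below stays feasible).
n, k = 8, 2
true_w, true_mu = np.array([0.4, 0.6]), np.array([-2.0, 2.0])
z = rng.choice(k, size=n, p=true_w)
x = rng.normal(true_mu[z], 1.0)

def phi(x, mu):
    # Unit-variance Gaussian density, vectorised.
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

def exact_likelihood(x, w, mu):
    # Brute force: sum over all k**n possible component allocations.
    total = 0.0
    for alloc in product(range(k), repeat=len(x)):
        a = np.array(alloc)
        total += np.prod(w[a] * phi(x, mu[a]))
    return total

def product_likelihood(x, w, mu):
    # Equivalent compact form: prod_i sum_j w_j * phi(x_i; mu_j).
    return np.prod(phi(x[:, None], mu[None, :]) @ w)

w, mu = np.array([0.5, 0.5]), np.array([-1.0, 1.0])
print(exact_likelihood(x, w, mu))    # sums 2**8 = 256 terms
print(product_likelihood(x, w, mu))  # same value, only n*k evaluations

# A few EM iterations (variances held fixed at 1 for brevity).
for _ in range(25):
    resp = w * phi(x[:, None], mu[None, :])       # E-step: responsibilities
    resp /= resp.sum(axis=1, keepdims=True)
    w = resp.mean(axis=0)                          # M-step: mixing weights
    mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)  # M-step: means

print("EM estimates:", w, mu)
```

Already at n = 8 the brute-force sum has 2^8 = 256 terms; at n = 100 it would have 2^100. This is why maximum likelihood estimates are computed via EM and why posterior densities, which face the same expansion, are handled by MCMC or by variational approximations of the kind cited in the references below.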

References

  • Attias, H. (1999). Inferring parameters and structure of latent variable models by variational Bayes. In Proc. 15th Conference on Uncertainty in Artificial Intelligence.

  • Barber, D. and Bishop, C.M. (1998). Ensemble learning for multi-layer networks. In M.I. Jordan, M.J. Kearns and S.A. Solla (eds.) Advances in Neural Information Processing Systems, 10. MIT Press.

  • Diebolt, J. and Robert, C.P. (1994). Estimation of finite mixture distributions through Bayesian sampling. J. R. Statist. Soc. B, 56, 363–375.

  • Ghahramani, Z. and Beal, M.J. (2000). Variational inference for Bayesian mixtures of factor analysers. In S.A. Solla, T.K. Leen and K.-R. Müller (eds.) Advances in Neural Information Processing Systems, 12. MIT Press (to appear).

  • Holst, U. and Lindgren, G. (1991). Recursive estimation in mixture models with Markov regime. IEEE Transactions on Information Theory, 37(6), 1683–1690.

  • Humphreys, K. and Titterington, D.M. (2000). Some examples of recursive variational approximations. In preparation.

  • MacKay, D.J.C. (1997). Ensemble learning for hidden Markov models. Technical report, Cavendish Laboratory, University of Cambridge.

  • Titterington, D.M., Smith, A.F.M. and Makov, U.E. (1985). Statistical Analysis of Finite Mixture Distributions. New York: Wiley.

Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Humphreys, K., Titterington, D.M. (2000). Approximate Bayesian inference for simple mixtures. In: Bethlehem, J.G., van der Heijden, P.G.M. (eds) COMPSTAT. Physica, Heidelberg. https://doi.org/10.1007/978-3-642-57678-2_42

  • DOI: https://doi.org/10.1007/978-3-642-57678-2_42

  • Publisher Name: Physica, Heidelberg

  • Print ISBN: 978-3-7908-1326-5

  • Online ISBN: 978-3-642-57678-2

