Some Aspects of Latent Structure Analysis

Conference paper. In: Subspace, Latent Structure and Feature Selection (SLSFS 2005)

Abstract

Latent structure models involve real, potentially observable variables and latent, unobservable variables. The framework includes various particular types of model, such as factor analysis, latent class analysis, latent trait analysis, latent profile models, mixtures of factor analysers, state-space models and others. The simplest scenario, that of a single discrete latent variable, includes finite mixture models, hidden Markov chain models and hidden Markov random field models. The paper gives a brief tutorial on the application of maximum likelihood and Bayesian approaches to parameter estimation within these models, emphasising that computational complexity varies greatly among the different scenarios. In the case of a single discrete latent variable, the issue of assessing its cardinality is discussed. Techniques such as the EM algorithm, Markov chain Monte Carlo methods and variational approximations are mentioned.
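As a concrete illustration of the simplest setting the abstract describes, the following is a minimal sketch (not taken from the paper) of the EM algorithm applied to a two-component univariate Gaussian mixture, where the latent variable is the discrete component label of each observation. All names and initialisation choices here are illustrative assumptions.

```python
import numpy as np

def em_gaussian_mixture(x, n_iter=200):
    """Fit a two-component univariate Gaussian mixture to data x by EM."""
    # Illustrative deterministic initialisation: means at the data extremes.
    pi = 0.5                                  # mixing weight of component 0
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: posterior probability that each point belongs to
        # component 0 (the common 1/sqrt(2*pi) factor cancels in the ratio).
        p0 = pi * np.exp(-0.5 * (x - mu[0]) ** 2 / var[0]) / np.sqrt(var[0])
        p1 = (1.0 - pi) * np.exp(-0.5 * (x - mu[1]) ** 2 / var[1]) / np.sqrt(var[1])
        r = p0 / (p0 + p1)
        # M-step: re-estimate parameters from the soft assignments.
        pi = r.mean()
        mu = np.array([np.sum(r * x) / r.sum(),
                       np.sum((1.0 - r) * x) / (1.0 - r).sum()])
        var = np.array([np.sum(r * (x - mu[0]) ** 2) / r.sum(),
                        np.sum((1.0 - r) * (x - mu[1]) ** 2) / (1.0 - r).sum()])
    return pi, mu, var

# Synthetic data with two well-separated components.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
pi, mu, var = em_gaussian_mixture(x)
```

Each E-step/M-step pair increases the observed-data likelihood; for this well-separated example the fitted means settle near the true values of -3 and 3. The same alternating structure underlies the hidden Markov and Markov random field cases, but there the E-step itself becomes computationally demanding, which is the point the abstract emphasises.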




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Titterington, D.M. (2006). Some Aspects of Latent Structure Analysis. In: Saunders, C., Grobelnik, M., Gunn, S., Shawe-Taylor, J. (eds) Subspace, Latent Structure and Feature Selection. SLSFS 2005. Lecture Notes in Computer Science, vol 3940. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11752790_4

  • DOI: https://doi.org/10.1007/11752790_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34137-6

  • Online ISBN: 978-3-540-34138-3
