Assessing the Number of Clusters of the Latent Class Model

  • François-Xavier Jollois
  • Mohamed Nadif
  • Gérard Govaert
Conference paper
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)


When partitioning the data is the main concern, each cluster is implicitly assumed to be, approximately, a sample from one component of a mixture model, so the clustering problem can be viewed as estimating the parameters of that mixture. Setting this problem under the maximum likelihood and classification likelihood approaches, we study the clustering of objects described by categorical attributes using the latent class model, concentrating on the problem of choosing the number of components. To this end, we use three criteria derived within a Bayesian framework. These criteria, based on approximations of the integrated likelihood and of the integrated classification likelihood, have recently been compared for Gaussian mixtures. In this work, we extend these comparisons to the latent class model.
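The setting in the abstract can be made concrete with a small sketch. For binary attributes, the latent class model is a mixture of products of independent Bernoulli distributions, fitted by EM; on the log-likelihood scale, BIC then penalizes the maximized log-likelihood by half the number of free parameters times log n. The helper names below (`em_latent_class`, `bic`) are illustrative, not from the paper.

```python
import numpy as np

def em_latent_class(X, K, n_iter=200, seed=0):
    """Fit a latent class model to binary data X (n x d) with EM.

    The model is a K-component mixture of products of independent
    Bernoulli distributions; returns the maximized log-likelihood.
    (Illustrative sketch, not the authors' implementation.)"""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)               # mixing proportions
    theta = rng.uniform(0.3, 0.7, (K, d))  # per-class Bernoulli parameters
    for _ in range(n_iter):
        # E-step: responsibilities, computed in log space for stability
        log_p = (np.log(pi)
                 + X @ np.log(theta).T
                 + (1.0 - X) @ np.log(1.0 - theta).T)        # shape (n, K)
        r = np.exp(log_p - np.logaddexp.reduce(log_p, axis=1, keepdims=True))
        # M-step: weighted proportions and frequencies
        nk = r.sum(axis=0)
        pi = nk / n
        theta = np.clip((r.T @ X) / nk[:, None], 1e-6, 1.0 - 1e-6)
    # final log-likelihood under the last parameter estimates
    log_p = (np.log(pi) + X @ np.log(theta).T
             + (1.0 - X) @ np.log(1.0 - theta).T)
    return np.logaddexp.reduce(log_p, axis=1).sum()

def bic(loglik, K, n, d):
    """BIC on the log-likelihood scale: higher is better."""
    nu = (K - 1) + K * d  # free parameters: proportions + Bernoulli params
    return loglik - 0.5 * nu * np.log(n)
```

Fitting the model for K = 1, 2, 3, ... and retaining the K that maximizes BIC mimics the kind of model selection the paper studies; an ICL-type criterion can be obtained from the same fit by additionally subtracting the estimated entropy of the responsibilities.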


Keywords: Mixture Model, Bayesian Information Criterion, Gaussian Mixture Model, Latent Class Analysis, Latent Class Model





Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • François-Xavier Jollois (1)
  • Mohamed Nadif (1)
  • Gérard Govaert (2)

  1. Laboratoire d’Informatique Théorique et Appliquée, Université de Metz, Metz Cedex, France
  2. Heudiasyc, UMR CNRS 6599, Université de Technologie de Compiègne, Compiègne Cedex, France
