
Number of Components and Initialization in Gaussian Mixture Model for Pattern Recognition

Conference paper
In: Artificial Neural Nets and Genetic Algorithms

Abstract

The number of components and the initial parameter estimates are of crucial importance for successful mixture estimation using the Expectation-Maximization (EM) algorithm. In this paper a method for complete mixture initialization based on a product kernel estimate of the probability density function is proposed. The mixture components are assumed to correspond to local maxima of an optimally smoothed kernel density estimate. A gradient method is used to find the local extrema. The local extrema are then grouped together to form component candidates, and these are merged by a hierarchical clustering method. Finally, the initial mixture parameters are estimated. A comparison to scale-space approaches for finding the number of components is given on examples.
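For illustration only, the following is a minimal Python sketch of the pipeline outlined above, not the authors' implementation: it uses scipy's gaussian_kde (with its default Scott bandwidth standing in for the paper's optimally smoothed product kernel), finite-difference gradient ascent to locate density modes, single-linkage hierarchical clustering with an arbitrary 0.5 distance cut-off to merge mode candidates, and scikit-learn's GaussianMixture (EM) initialized from the merged modes. The synthetic data, step sizes, and thresholds are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 2-D data from three Gaussian blobs (a stand-in for real pattern data).
X = np.vstack([rng.normal(mu, 0.4, size=(200, 2))
               for mu in ([0.0, 0.0], [3.0, 0.0], [1.5, 2.5])])

# 1) Kernel density estimate of the data (Scott's-rule bandwidth here,
#    used in place of the paper's optimally smoothed product kernel).
kde = gaussian_kde(X.T)

def density_grad(x, h=1e-3):
    """Central-difference gradient of the KDE at point x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (kde(x + e)[0] - kde(x - e)[0]) / (2.0 * h)
    return g

def ascend(x0, step=1.0, n_iter=200, tol=1e-6):
    """Plain gradient ascent from x0 towards a local maximum of the KDE."""
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        g = density_grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + step * g
    return x

# 2) Run gradient ascent from a subsample of the data; the end points
#    accumulate near the local maxima (modes) of the density estimate.
starts = X[rng.choice(len(X), size=50, replace=False)]
modes = np.array([ascend(x) for x in starts])

# 3) Merge the mode candidates with single-linkage hierarchical clustering;
#    the 0.5 distance cut-off is an arbitrary choice for this example.
labels = fcluster(linkage(modes, method="single"), t=0.5, criterion="distance")
centers = np.array([modes[labels == k].mean(axis=0) for k in np.unique(labels)])

# 4) The merged modes give both the number of components and the initial
#    means for EM (scikit-learn's GaussianMixture here).
gmm = GaussianMixture(n_components=len(centers), means_init=centers).fit(X)
print("estimated number of components:", len(centers))
print("EM component means:\n", gmm.means_)
```

In this sketch the merged modes supply both the number of components and the initial means; the weights and covariances could likewise be estimated from the points attracted to each mode before running EM.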






Copyright information

© 2001 Springer-Verlag Wien

About this paper

Cite this paper

Paclík, P., Novovičová, J. (2001). Number of Components and Initialization in Gaussian Mixture Model for Pattern Recognition. In: Kůrková, V., Neruda, R., Kárný, M., Steele, N.C. (eds) Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6230-9_101


  • DOI: https://doi.org/10.1007/978-3-7091-6230-9_101

  • Publisher Name: Springer, Vienna

  • Print ISBN: 978-3-211-83651-4

  • Online ISBN: 978-3-7091-6230-9

  • eBook Packages: Springer Book Archive
