Assessment of a Modified Version of the EM Algorithm for Remote Sensing Data Classification

  • Thales Sehn Korting
  • Luciano Vieira Dutra
  • Guaraci José Erthal
  • Leila Maria Garcia Fonseca
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6419)

Abstract

This work presents an assessment of a modified version of the standard EM clustering algorithm for remote sensing data classification. Since clusters with very similar mean vectors that differ only in their covariance structure are not natural for remote sensing objects, a modification was proposed to avoid keeping clusters whose centres are too close. Further modifications were also proposed: improving the EM initialization by providing the results of the well-known K-means algorithm as seed points, and introducing rules for decreasing the number of modes when a cluster's a priori probability becomes very low. Experiments on classifying QuickBird high-resolution images of an urban region were carried out. The modified EM algorithm showed the best agreement with a reference map plotted on the scene when compared with standard K-means and SOM results.
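
The three modifications described above (K-means seeding, pruning low-prior components, and removing clusters with near-coincident centres) can be illustrated with a minimal NumPy sketch of a Gaussian-mixture EM loop. This is not the authors' implementation: the function name `modified_em`, the thresholds `min_prior` and `min_dist`, and the choice to simply drop one of a too-close pair of centres (rather than a full merge) are illustrative assumptions.

```python
import numpy as np

def modified_em(X, k=4, n_iter=50, min_prior=0.02, min_dist=0.5, seed=0):
    """GMM EM loop with the modifications sketched in the abstract:
    (1) K-means seeding, (2) removing components whose centres are
    closer than min_dist, (3) dropping components whose a priori
    probability falls below min_prior. Thresholds are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # (1) short K-means pass to seed the component means
    mu = X[rng.choice(n, k, replace=False)]
    for _ in range(10):
        lab = np.argmin(((X[:, None] - mu[None]) ** 2).sum(-1), axis=1)
        mu = np.array([X[lab == j].mean(0) if np.any(lab == j) else mu[j]
                       for j in range(k)])
    pri = np.full(k, 1.0 / k)
    cov = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)])
    for _ in range(n_iter):
        k_cur = len(mu)
        # E-step: responsibilities of each Gaussian component
        resp = np.empty((n, k_cur))
        for j in range(k_cur):
            diff = X - mu[j]
            inv = np.linalg.inv(cov[j])
            norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov[j]))
            quad = np.einsum('ij,jk,ik->i', diff, inv, diff)
            resp[:, j] = pri[j] * np.exp(-0.5 * quad) / norm
        resp /= resp.sum(1, keepdims=True) + 1e-300
        # M-step: re-estimate priors, means, covariances
        nk = resp.sum(0) + 1e-12
        pri = nk / n
        mu = (resp.T @ X) / nk[:, None]
        for j in range(k_cur):
            diff = X - mu[j]
            cov[j] = (resp[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
        # (3) mark components whose a priori probability is very low
        keep = pri > min_prior
        # (2) drop one of any pair of centres that are too close
        for a in range(k_cur):
            for b in range(a + 1, k_cur):
                if keep[a] and keep[b] and np.linalg.norm(mu[a] - mu[b]) < min_dist:
                    keep[b] = False
        if not keep.all():
            mu, cov, pri = mu[keep], cov[keep], pri[keep]
            pri = pri / pri.sum()
    return mu, cov, pri
```

Starting from a deliberately overestimated `k`, the pruning rules let the number of modes shrink during the iterations, which is the behaviour the abstract attributes to the modified algorithm.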



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Thales Sehn Korting¹
  • Luciano Vieira Dutra¹
  • Guaraci José Erthal¹
  • Leila Maria Garcia Fonseca¹
  1. Image Processing Division, National Institute for Space Research (INPE), São José dos Campos, Brazil
