Clustering Properties of Hierarchical Self-Organizing Maps

  • Jouko Lampinen
  • Erkki Oja

Abstract

A multilayer hierarchical self-organizing map (HSOM) is discussed as an unsupervised clustering method. The HSOM is shown to form arbitrarily complex clusters, in analogy with multilayer feedforward networks. In addition, the HSOM provides a natural measure for the distance of a point from a cluster that weights all the points belonging to the cluster appropriately. Experiments with both artificial and real data demonstrate that the multilayer SOM forms clusters that match the desired classes better than direct SOMs, classical k-means, or the Isodata algorithm.
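The sketch below illustrates the general idea of stacking self-organizing maps for clustering. It is not the paper's algorithm: it assumes, purely for illustration, that the second layer is trained on the normalized winner coordinates of the first map, and the grid sizes, learning schedule, and helper names (`train_som`, `winner_coords`) are arbitrary choices.

```python
# Minimal two-layer "hierarchical SOM" sketch (illustrative assumptions only).
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=None, seed=0):
    """Train a 2-D SOM with a Gaussian neighborhood; return codebook and unit coordinates."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    n_units, dim = rows * cols, data.shape[1]
    codebook = rng.normal(size=(n_units, dim)) * data.std(axis=0) + data.mean(axis=0)
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    sigma0 = sigma0 or max(rows, cols) / 2.0
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in data[rng.permutation(len(data))]:
            t = step / n_steps
            lr = lr0 * (1.0 - t)                   # linearly decaying learning rate
            sigma = sigma0 * (0.05 / sigma0) ** t  # shrinking neighborhood radius
            winner = np.argmin(((codebook - x) ** 2).sum(axis=1))
            d2 = ((coords - coords[winner]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))   # neighborhood weights around the winner
            codebook += lr * h[:, None] * (x - codebook)
            step += 1
    return codebook, coords

def winner_coords(data, codebook, coords, grid):
    """Map each sample to the normalized grid coordinates of its winning unit."""
    winners = np.argmin(((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1), axis=1)
    return coords[winners] / np.array(grid, dtype=float)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy data: three Gaussian blobs in 2-D.
    data = np.vstack([rng.normal(m, 0.3, size=(100, 2)) for m in ([0, 0], [3, 0], [0, 3])])
    # Layer 1: a large map acting as a fine quantizer of the input space.
    cb1, co1 = train_som(data, grid=(10, 10))
    layer1_out = winner_coords(data, cb1, co1, (10, 10))
    # Layer 2: a small map trained on layer-1 responses; its units act as cluster labels.
    cb2, co2 = train_som(layer1_out, grid=(1, 3))
    labels = np.argmin(((layer1_out[:, None, :] - cb2[None, :, :]) ** 2).sum(-1), axis=1)
    print("cluster sizes:", np.bincount(labels, minlength=3))
```

In this toy setup the top map has as many units as expected clusters, so its winner index serves directly as a cluster label; the paper's exact layer coupling and distance measure should be taken from the full text.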

Key words

cluster analysis · self-organizing maps · neural networks

Copyright information

© Springer Science+Business Media New York 1993

Authors and Affiliations

  • Jouko Lampinen¹
  • Erkki Oja¹

  1. Department of Information Technology, Lappeenranta University of Technology, Lappeenranta, Finland
