Variants of SOM

  • Teuvo Kohonen
Part of the Springer Series in Information Sciences book series (SSINF, volume 30)


In order to create spatially ordered and organized representations of input occurrences in a neural network, the most essential principle seems to be to confine the learning corrections to subsets of network units that lie in the topological neighborhood of the best-matching unit. There seems to exist an indefinite number of ways to define the matching of an input occurrence with the internal representations, and even the neighborhood of a unit can be defined in many ways. Nor is it necessary to define the corrections as gradient steps in the parameter space: improvements in matching may be achieved by batch computation or evolutionary changes. Consequently, all such cases will henceforth be regarded as belonging to the broader category of Self-Organizing Map (SOM) algorithms. This category may also be thought to contain both supervised and unsupervised learning methods.
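The core principle stated above, finding the best-matching unit and confining learning corrections to its topological neighborhood, can be sketched as a single update step of the basic SOM. This is an illustrative sketch only, not Kohonen's own code: the function name, the linear decay schedules for the learning rate and neighborhood radius, and the Gaussian neighborhood function are all assumptions chosen for clarity.

```python
import numpy as np

def som_step(weights, x, t, n_steps, grid):
    """One SOM update step (illustrative sketch, not Kohonen's exact code).

    weights : (n_units, dim) array of model vectors
    x       : (dim,) input occurrence
    t       : current step index, 0 <= t < n_steps
    grid    : (n_units, 2) map coordinates of the units
    """
    # Best-matching unit: smallest Euclidean distance to the input
    bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    # Assumed schedules: learning rate and neighborhood radius shrink over time
    alpha = 0.5 * (1.0 - t / n_steps)
    sigma = max(1.0, 3.0 * (1.0 - t / n_steps))
    # Gaussian neighborhood on the map grid, centered at the BMU
    d = np.linalg.norm(grid - grid[bmu], axis=1)
    h = np.exp(-d**2 / (2.0 * sigma**2))
    # Corrections confined (softly) to the BMU's topological neighborhood
    weights += alpha * h[:, None] * (x - weights)
    return bmu
```

As the text notes, this gradient-step form is only one possibility: the same neighborhood principle can instead be applied in batch computation, where all inputs are assigned to units first and the model vectors are then replaced by neighborhood-weighted means.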





Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Teuvo Kohonen¹
  1. Neural Networks Research Centre, Helsinki University of Technology, Espoo, Finland
