Variants of SOM
In order to create spatially ordered and organized representations of input occurrences in a neural network, the most essential principle seems to be to confine the learning corrections to subsets of network units that lie in the topological neighborhood of the best-matching unit. There seems to exist an indefinite number of ways to define the matching of an input occurrence with the internal representations, and even the neighborhood of a unit can be defined in many ways. Nor is it necessary to define the corrections as gradient steps in the parameter space: improvements in matching may be achieved by batch computation or by evolutionary changes. Consequently, all such cases will henceforth be regarded as belonging to the broader category of Self-Organizing Map (SOM) algorithms. This category may also be thought to contain both supervised and unsupervised learning methods.
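The essential principle stated above — corrections confined to the topological neighborhood of the best-matching unit — can be sketched as a single learning step. This is a minimal illustration, not the book's own formulation: the Gaussian neighborhood kernel, the 2-D grid, and the parameter names (`lr`, `sigma`) are one common choice among the many matching and neighborhood definitions the text mentions.

```python
import numpy as np

def som_step(weights, grid, x, lr=0.1, sigma=1.0):
    """One SOM learning step: locate the best-matching unit (BMU),
    then correct every unit toward the input, weighted by its
    topological distance from the BMU on the grid."""
    # Matching: unit whose weight vector is closest to the input
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
    # Neighborhood defined on the grid (not in weight space):
    # a Gaussian kernel centered on the BMU's grid position
    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))
    # Correction confined (smoothly) to the BMU's neighborhood
    weights += lr * h[:, None] * (x - weights)
    return weights

# Illustrative run: a 5x5 grid of units with 2-D weight vectors
rng = np.random.default_rng(0)
grid = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
weights = rng.random((25, 2))
for x in rng.random((200, 2)):
    som_step(weights, grid, x)
```

A batch variant would accumulate the neighborhood-weighted averages of all inputs before updating, rather than taking these incremental steps, which illustrates the point that gradient-style corrections are not the only route to improved matching.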
Keywords: Basis Vector · Receptive Field · Minimal Spanning Tree · Vector Quantization · Stochastic Approximation