Learning with Mappings and Input-Orderings using Random Access Memory-based Neural Networks
Random Access Memory (RAM)-based systems have been studied for several years. A recent paper by Gera and Sperduti demonstrates how one such RAM-based network can be used to produce a real-number ordering for each member of a training set. This coding reflects the relative similarity of the input patterns in a highly concise way and is very economical in its memory requirements: only one neuron, or discriminator, is required. However, the larger the training set, the closer some codes become, so more decimal places are needed to keep them distinct. This paper describes one possible solution to that problem. A two-stage learning method is used: in the first stage, a Kohonen-like RAM network divides the training set into groupings of similar patterns, and each grouping is then associated with its own Gera/Sperduti discriminator. Since the input patterns presented to each such discriminator are similar, the discriminator itself can be pruned to remove redundant information and maximise output variance.
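To make the underlying mechanism concrete, the following is a minimal sketch of a generic single-discriminator RAM (n-tuple, WISARD-style) network in Python. It is an illustration of the general technique only, not the Gera/Sperduti coding scheme itself: the tuple size, the random partitioning of input lines, and the fractional similarity score are all assumptions chosen for clarity.

```python
import random

def make_tuples(n_bits, tuple_size, seed=0):
    # Randomly partition the input lines into n-tuples (WISARD-style).
    rng = random.Random(seed)
    idx = list(range(n_bits))
    rng.shuffle(idx)
    return [tuple(idx[i:i + tuple_size]) for i in range(0, n_bits, tuple_size)]

def address(pattern, positions):
    # The selected input bits form the address into one RAM.
    a = 0
    for p in positions:
        a = (a << 1) | pattern[p]
    return a

def train(patterns, tuples):
    # One RAM per tuple; training writes a 1 at each addressed location.
    rams = [dict() for _ in tuples]
    for pat in patterns:
        for ram, pos in zip(rams, tuples):
            ram[address(pat, pos)] = 1
    return rams

def score(pattern, rams, tuples):
    # Fraction of RAMs whose addressed location was set during training;
    # similar patterns address many of the same locations, so they score alike.
    hits = sum(ram.get(address(pattern, pos), 0)
               for ram, pos in zip(rams, tuples))
    return hits / len(tuples)

# Example: train one discriminator on two similar 8-bit patterns.
tuples = make_tuples(n_bits=8, tuple_size=2)
training_set = [[1, 1, 1, 1, 0, 0, 0, 0],
                [1, 1, 1, 0, 0, 0, 0, 0]]
rams = train(training_set, tuples)
print(score(training_set[0], rams, tuples))  # a trained pattern scores 1.0
```

Pruning, in this setting, amounts to discarding RAMs whose contents are identical for every pattern in a grouping, since those tuples carry no information that separates the (already similar) patterns.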
Keywords: Input Pattern · Memory Location · Random Access Memory · Input Unit · Input Line
- Aleksander I, 'The Logic of Connectionist Systems'. In I. Aleksander (ed.), Neural Computing Architectures: The Design of Brain-like Machines. London: North Oxford Academic, 1989.
- Kohonen T, Self-Organization and Associative Memory. Springer-Verlag, 1984.
- Gera M H and Sperduti A, 'Unsupervised and Mixed Learning using Input-Orderings with Weightless Neural Networks'. Proceedings of the International Joint Conference on Neural Networks, Beijing, November 1992.
- Robins A V, 'Multiple Representations in Connectionist Systems'. International Journal of Neural Systems, 4: 345–362, 1992.