Abstract
In this paper, we propose a new information-theoretic method to interpret competitive learning. We call the method "pseudo-network growing," because a network is re-grown gradually after learning, taking into account the importance of its components. In particular, we apply the method to clarify the class structure of self-organizing maps. First, the importance of each input unit is computed; input units are then added gradually, in order of importance. We expect the number of corresponding competitive units to increase gradually as well, revealing the main characteristics of the network configuration and the input patterns. We applied the method to the well-known Senate data with two distinct classes. With the conventional SOM, explicit class boundaries could not be obtained, because of the inappropriate map size imposed in the experiment. With pseudo-network growing, however, a clear boundary was observed in the first growing stage, and the detailed class structure was gradually reproduced.
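The growing procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the paper's importance measure is information-theoretic, whereas the `unit_importance` stand-in below simply uses per-input variance, and `train_som` is a bare-bones online SOM. All function and parameter names are hypothetical.

```python
import numpy as np

def train_som(data, rows=3, cols=3, epochs=20, lr=0.5, sigma=1.0, seed=0):
    """Minimal online SOM; returns the trained weight grid."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    w = rng.standard_normal((rows * cols, d)) * 0.1
    # 2-D grid coordinates, used for the neighbourhood function
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for epoch in range(epochs):
        a = lr * (1 - epoch / epochs)                      # decaying learning rate
        for x in data[rng.permutation(n)]:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))    # best-matching unit
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2 * sigma ** 2))          # Gaussian neighbourhood
            w += a * h[:, None] * (x - w)
    return w

def unit_importance(data):
    """Stand-in importance score (assumption): per-input variance.
    The paper uses an information-theoretic importance measure instead."""
    return data.var(axis=0)

def pseudo_network_growing(data, stages=3, **som_kw):
    """Re-train the SOM on progressively larger subsets of input units,
    ordered by importance; return the map from each growing stage."""
    order = np.argsort(unit_importance(data))[::-1]        # most important first
    d = data.shape[1]
    maps = []
    for k in np.linspace(1, d, stages, dtype=int):
        active = order[:k]                                 # indices of active inputs
        maps.append((active, train_som(data[:, active], **som_kw)))
    return maps
```

Each stage yields a map trained on only the most important inputs, so coarse class structure (e.g. the two-class boundary in the Senate data) can appear in the first stage before finer detail is added.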
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Kamimura, R. (2010). Pseudo-network Growing for Gradual Interpretation of Input Patterns. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds) Neural Information Processing. Models and Applications. ICONIP 2010. Lecture Notes in Computer Science, vol 6444. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17534-3_46
DOI: https://doi.org/10.1007/978-3-642-17534-3_46
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-17533-6
Online ISBN: 978-3-642-17534-3