Task Decomposition and Correlations in Growing Artificial Neural Networks
To reduce the engineering effort in designing neural network architectures, a data-driven algorithm is desirable that constructs the network during the learning process. For structure adaptation, different approaches are used, including evolutionary algorithms (Voigt et al., 1993), growth algorithms (Fahlman et al., 1990), and others.
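The growth-algorithm idea cited above (Fahlman & Lebiere, 1990) can be illustrated with a minimal cascade-correlation-style sketch: repeatedly train a candidate hidden unit to maximize the covariance of its output with the current residual error, freeze it as a new feature, and refit the output weights. This is an illustrative toy in NumPy, not the authors' algorithm; the task, unit count, and training schedule are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumption): learn y = sin(3x) from noisy samples.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=200)

def with_bias(H):
    """Append a constant bias column to the feature matrix."""
    return np.hstack([H, np.ones((H.shape[0], 1))])

def lstsq_fit(H, y):
    """Least-squares output weights over the current feature pool."""
    w, *_ = np.linalg.lstsq(with_bias(H), y, rcond=None)
    return w

H = X.copy()                      # feature pool: inputs + grown hidden units
w = lstsq_fit(H, y)
errors = [np.mean((with_bias(H) @ w - y) ** 2)]

for _ in range(8):                # grow up to 8 hidden units
    resid = y - with_bias(H) @ w
    resid_c = resid - resid.mean()
    # Train one candidate tanh unit by gradient ascent on the magnitude of
    # the covariance between its output and the residual error
    # (the cascade-correlation criterion, simplified).
    v = rng.normal(scale=0.5, size=H.shape[1] + 1)
    A = with_bias(H)
    for _ in range(200):
        out = np.tanh(A @ v)
        cov = np.mean((out - out.mean()) * resid_c)
        sign = 1.0 if cov >= 0 else -1.0
        grad = sign * A.T @ (resid_c * (1 - out ** 2)) / len(y)
        v += 0.5 * grad
    # Freeze the new unit, append its output as a feature, retrain output.
    H = np.hstack([H, np.tanh(A @ v)[:, None]])
    w = lstsq_fit(H, y)
    errors.append(np.mean((with_bias(H) @ w - y) ** 2))
```

Because each grown unit only enlarges the feature pool before the output weights are refit by least squares, the training error is non-increasing as the network grows.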
Keywords: Hidden Layer, Residual Error, Input Space, Window Function, Hidden Unit
- [S. E. Fahlman & C. Lebiere, 1990] "The Cascade-Correlation Learning Architecture", in D. S. Touretzky (ed.), Advances in Neural Information Processing Systems 2, Morgan Kaufmann.
- [M. Jordan & R. A. Jacobs, 1992] "Hierarchies of adaptive experts", in Proceedings of the 1992 Neural Information Processing Systems Conference, vol. 4, pp. 985–992.
- [E. Littman & H. Ritter, 1993] "Generalization Abilities of Cascade Network Architecture", in Artificial Neural Networks 5.
- [F. Smieja & H. Mühlenbein, 1992] "Reflective modular neural network systems", GMD Technical Report 633, Sankt Augustin, Germany.
- [H.-M. Voigt, J. Born & I. Santibanez-Koref, 1993] "Evolutionary Structuring of Artificial Neural Networks", Technical University Berlin, Bionics and Evolution Techniques Lab, Technical Report TR-93-002.