Abstract
This paper deals with optimal learning and provides a unified viewpoint of the most significant results in the field. The focus is on the problem of local minima in the cost function, which is likely to affect virtually any learning algorithm. We give some intriguing links between optimal learning and the computational complexity of loading problems. We exhibit a computational model in which the solution of all loading problems giving rise to unimodal error functions requires the same time, thus suggesting that they belong to the same computational class.
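As a minimal illustration of the local-minima issue discussed above (a sketch added here, not part of the chapter itself), consider plain gradient descent on two one-dimensional costs: a non-unimodal one, where the stationary point reached depends on initialization, and a unimodal one, where every start reaches the unique global minimum. The functions and step sizes below are illustrative choices, not taken from the chapter.

```python
# Illustrative sketch (not from the chapter): on a non-unimodal cost,
# gradient descent converges to different minima depending on the
# starting point; on a unimodal (convex) cost, every start reaches
# the unique global minimum.

def gradient_descent(grad, x0, lr=0.01, steps=5000):
    """Plain gradient descent; returns the final iterate."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Non-unimodal cost f(x) = x^4 - 3x^2 + x has two separated minima
# (near x ~ -1.30 and x ~ 1.13).
grad_multimodal = lambda x: 4 * x**3 - 6 * x + 1

# Unimodal cost g(x) = (x - 1)^2 has a single minimum at x = 1.
grad_unimodal = lambda x: 2 * (x - 1)

# Different starts end in different minima of the multimodal cost.
left = gradient_descent(grad_multimodal, x0=-2.0)   # ~ -1.30
right = gradient_descent(grad_multimodal, x0=+2.0)  # ~ +1.13

# Both starts converge to the unique minimum of the unimodal cost.
u1 = gradient_descent(grad_unimodal, x0=-2.0)  # ~ 1.0
u2 = gradient_descent(grad_unimodal, x0=+2.0)  # ~ 1.0
```

The unimodal case is the setting the chapter's complexity result addresses: when the error surface has a single minimum, the outcome of local search no longer depends on where it starts.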
© 1997 Springer Science+Business Media New York
Bianchini, M., Fanelli, S., Gori, M., Protasi, M. (1997). Unimodal Loading Problems. In: Ellacott, S.W., Mason, J.C., Anderson, I.J. (eds) Mathematics of Neural Networks. Operations Research/Computer Science Interfaces Series, vol 8. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-6099-9_15
Print ISBN: 978-1-4613-7794-8
Online ISBN: 978-1-4615-6099-9