Unimodal Loading Problems

Chapter in Mathematics of Neural Networks

Part of the book series: Operations Research/Computer Science Interfaces Series (ORCS, volume 8)

Abstract

This paper deals with optimal learning and provides a unified viewpoint of the most significant results in the field. The focus is on the problem of local minima in the cost function, which is likely to affect virtually any learning algorithm. We give some intriguing links between optimal learning and the computational complexity of loading problems. We exhibit a computational model in which the solution of all loading problems giving rise to unimodal error functions requires the same time, thus suggesting that they belong to the same computational class.
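
To make the role of unimodality concrete, here is a minimal illustrative sketch (not taken from the chapter; both error functions, the starting weight, and the step size are invented for the example). It contrasts plain gradient descent on a unimodal error surface, where the only stationary point is the global minimum, with descent on a surface that contains a spurious local minimum, where the same procedure can get trapped.

    # Illustrative sketch only: both error functions below are invented to
    # contrast a unimodal loading surface with a multimodal one.

    def gradient_descent(grad, w0, lr=0.01, steps=2000):
        """Plain gradient descent on a single scalar weight."""
        w = w0
        for _ in range(steps):
            w -= lr * grad(w)
        return w

    # Unimodal error: a single basin, global minimum at w = 2.
    E_uni  = lambda w: (w - 2.0) ** 2
    dE_uni = lambda w: 2.0 * (w - 2.0)

    # Multimodal error: global minimum near w = +2,
    # spurious local minimum near w = -2.
    E_multi  = lambda w: (w ** 2 - 4.0) ** 2 - 0.5 * w
    dE_multi = lambda w: 4.0 * w * (w ** 2 - 4.0) - 0.5

    w0 = -3.0  # identical starting weight for both surfaces
    w_uni = gradient_descent(dE_uni, w0)
    w_multi = gradient_descent(dE_multi, w0)

    print(f"unimodal surface:   w = {w_uni:+.3f}, E = {E_uni(w_uni):+.3f} (global minimum)")
    print(f"multimodal surface: w = {w_multi:+.3f}, E = {E_multi(w_multi):+.3f} (spurious local minimum)")

On the unimodal surface the iterate settles at the global minimum w = 2 regardless of the starting point; on the multimodal surface the same starting point leads into the spurious basin near w = -2. This failure mode is exactly what loading problems with unimodal error functions rule out.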




Copyright information

© 1997 Springer Science+Business Media New York

About this chapter

Cite this chapter

Bianchini, M., Fanelli, S., Gori, M., Protasi, M. (1997). Unimodal Loading Problems. In: Ellacott, S.W., Mason, J.C., Anderson, I.J. (eds) Mathematics of Neural Networks. Operations Research/Computer Science Interfaces Series, vol 8. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-6099-9_15

  • DOI: https://doi.org/10.1007/978-1-4615-6099-9_15

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4613-7794-8

  • Online ISBN: 978-1-4615-6099-9
