
A Method of Accelerating Neural Network Learning


Abstract

The article presents a method of accelerating neural network learning with the Back Propagation algorithm and one of its fastest modifications, the Levenberg–Marquardt method. Learning is accelerated by introducing a 'single-direction' coefficient for the change of x used when calculating its new values; the number of iterations is decreased by approximately 30%. Simulation results of training neural networks with both the classic method and the accelerated procedure are presented.
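For orientation, the standard Levenberg–Marquardt step that the accelerated procedure builds on solves (JᵀJ + μI)Δx = −Jᵀe for the parameter change Δx, where J is the Jacobian of the error vector e and μ is a damping coefficient. The short Python/NumPy sketch below illustrates only that baseline step on a toy least-squares fit; the function names and the fixed damping value mu are illustrative assumptions, and it does not implement the paper's 'single-direction' acceleration coefficient.

    # Minimal sketch of one standard Levenberg-Marquardt update step
    # (baseline method only, not the paper's accelerated variant).
    import numpy as np

    def lm_step(x, residuals, jacobian, mu=1e-3):
        """One LM update: dx = -(J^T J + mu*I)^-1 J^T e."""
        e = residuals(x)                     # error vector e(x)
        J = jacobian(x)                      # Jacobian of e with respect to x
        A = J.T @ J + mu * np.eye(x.size)    # damped Gauss-Newton matrix
        dx = -np.linalg.solve(A, J.T @ e)    # solve for the parameter change
        return x + dx

    # Toy example: fit y = a*x + b to noisy data by least squares.
    xs = np.linspace(0.0, 1.0, 20)
    ys = 2.0 * xs + 1.0 + 0.01 * np.random.randn(xs.size)

    def residuals(p):
        a, b = p
        return a * xs + b - ys

    def jacobian(p):
        return np.stack([xs, np.ones_like(xs)], axis=1)

    p = np.zeros(2)
    for _ in range(10):
        p = lm_step(p, residuals, jacobian)
    print(p)   # approaches [2.0, 1.0]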



Author information


Corresponding author

Correspondence to Sotir Sotirov.


About this article

Cite this article

Sotirov, S. A Method of Accelerating Neural Network Learning. Neural Process Lett 22, 163–169 (2005). https://doi.org/10.1007/s11063-005-3094-9

