
A Novel Global Hybrid Algorithm for Feedforward Neural Networks

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4493)

Abstract

This paper presents a novel global optimization hybrid algorithm for training feedforward neural networks. During training, when the weights are adjusted with the Quasi-Newton (QN) method, the error function may become trapped in a local minimum. To overcome this problem, a new Filled Function is constructed and proved, and it is combined with the QN method to form a global optimization hybrid algorithm. When the network is trained with this hybrid algorithm and the error function becomes trapped at a local minimum, the algorithm is able to move the search out of that local minimum; the weights can then continue to be adjusted until the global minimum of the weight vector is found. An illustrative example demonstrates the effectiveness of the presented scheme.
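The filled function itself appears only in the full text of the paper. As background, a hedged sketch of what "filled function" means in this literature: for an error function f with a current local minimizer x_1^*, a filled function P is, roughly, a function that turns x_1^* into a local maximizer while keeping minimizers only in lower basins of f, so that minimizing P leads the search out of the current basin. A classical example is Ge's filled function; the parameters r and rho below are its standard tuning parameters from the filled-function literature, not quantities taken from this paper:

    % x_1^* : current local minimizer of the error function f
    % r, rho : positive tuning parameters of the filled function
    P(x) = \frac{1}{r + f(x)}\,
           \exp\!\left(-\frac{\lVert x - x_1^{*}\rVert^{2}}{\rho}\right)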

This work is supported by the National Natural Science Foundation of P.R. China under Grant #60674063, the National Postdoctoral Science Foundation of P.R. China under Grant #2005037755, and the Natural Science Foundation of Liaoning Province under Grant #20062024.
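Since only the abstract is available here, the following is a minimal Python sketch of how a quasi-Newton/filled-function hybrid of the kind described above could be organised, not the authors' actual algorithm. It assumes SciPy's BFGS routine as the quasi-Newton step, uses the generic Ge-type filled function shown above, and substitutes a toy multimodal function for the network's error surface; all names (error, filled_function, hybrid_train) and parameter values are illustrative.

    import numpy as np
    from scipy.optimize import minimize

    def error(w):
        # Toy multimodal stand-in for a feedforward network's training error E(w).
        return np.sum(w ** 2) + 2.0 * np.sum(np.sin(3.0 * w) ** 2)

    def filled_function(w, w_star, f, r=1.0, rho=1.0):
        # Generic Ge-type filled function centred at the current local
        # minimizer w_star; r and rho usually need problem-specific tuning.
        return np.exp(-np.sum((w - w_star) ** 2) / rho) / (r + f(w))

    def hybrid_train(w0, f, cycles=10):
        # Alternate a local quasi-Newton phase with a filled-function
        # escape phase, keeping the best weights seen so far.
        w = np.asarray(w0, dtype=float)
        best_w, best_f = w, f(w)
        for _ in range(cycles):
            local = minimize(f, w, method="BFGS")           # local QN descent
            if local.fun < best_f:
                best_w, best_f = local.x, local.fun
            w_star = local.x
            start = w_star + 0.1 * np.random.randn(w_star.size)
            fill = minimize(lambda v: filled_function(v, w_star, f),
                            start, method="BFGS",
                            options={"maxiter": 50})        # escape phase
            w = fill.x                                      # restart QN from here
        return best_w, best_f

    w_opt, f_opt = hybrid_train(np.full(4, 2.0), error)
    print(w_opt, f_opt)

In a real application, error would be replaced by the network's training error over the weight vector, and the escape phase would be invoked only when the local minimum found by the QN step is judged unsatisfactory, as the abstract describes.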



Editor information

Derong Liu, Shumin Fei, Zengguang Hou, Huaguang Zhang, Changyin Sun


Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Li, H., Li, H., Du, Y. (2007). A Novel Global Hybrid Algorithm for Feedforward Neural Networks. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4493. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72395-0_2

  • DOI: https://doi.org/10.1007/978-3-540-72395-0_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72394-3

  • Online ISBN: 978-3-540-72395-0

  • eBook Packages: Computer Science, Computer Science (R0)
