A Stochastic Optimization Approach for Training the Parameters in Neural Networks

  • Norio Baba
Conference paper
Part of the Lecture Notes in Economics and Mathematical Systems book series (LNE, volume 374)

Abstract

Recently, the back-propagation method has often been applied to adapt artificial neural networks to various pattern classification problems. An important limitation of this method, however, is that it sometimes fails to find the global minimum of the total error function of the neural network. In this paper, a hybrid algorithm that combines a modified back-propagation method with the random optimization method is proposed in order to find the global minimum of the total error function in a small number of steps. It is shown that this hybrid algorithm ensures convergence to the global minimum with probability 1 in a compact region of the weight vector space. Further, several computer simulation results are given for problems such as forecasting air pollution density and stock prices.
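
As a rough illustration of the idea, the sketch below alternates a deterministic gradient step (playing the role of the modified back-propagation phase) with a Matyas-style random perturbation of the weight vector, keeping whichever candidate has the lowest total error. Everything concrete here (the toy XOR network, the numerical gradient standing in for back-propagation, the step size, and the noise scale) is an illustrative assumption, not the paper's exact algorithm.

```python
# A minimal sketch, assuming a toy 2-2-1 network on XOR; not the paper's
# exact hybrid algorithm. A gradient step (standing in for modified
# back-propagation) is alternated with a Solis-Wets style random search:
# perturb the weight vector with Gaussian noise and keep improvements.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    """Split the flat weight vector into layer parameters."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    return W1, b1, W2, b2

def error(w):
    """Total squared error E(w) over the training set."""
    W1, b1, W2, b2 = unpack(w)
    hidden = np.tanh(X @ W1 + b1)
    out = hidden @ W2 + b2
    return np.sum((out - y) ** 2)

def grad(w, eps=1e-6):
    """Central-difference gradient; stands in for back-propagation."""
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w)
        d[i] = eps
        g[i] = (error(w + d) - error(w - d)) / (2 * eps)
    return g

w = rng.normal(scale=0.5, size=9)  # initial weight vector
for step in range(2000):
    w_bp = w - 0.05 * grad(w)                        # gradient (BP) phase
    w_rand = w + rng.normal(scale=0.3, size=w.size)  # random search phase
    # Keep whichever candidate lowers the total error; reject otherwise.
    w = min((w, w_bp, w_rand), key=error)

print("final error:", error(w))
```

The random phase is what allows escape from local minima: since the Gaussian perturbation puts positive probability on every neighbourhood of a compact weight region, descent-plus-random-search schemes of this kind can be shown to find the global minimum with probability 1, which is the property the paper establishes for its hybrid.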

Keywords

Neural Network, Global Minimum, Weight Vector, Stock Price, Line Search

Copyright information

© Springer-Verlag Berlin Heidelberg 1992

Authors and Affiliations

  • Norio Baba
    1. Information Science, Osaka Educational Univ., Ikeda City 563, Japan
