
Modification of the Particle Swarm Optimizer for Locating All the Global Minima

  • K. E. Parsopoulos
  • M. N. Vrahatis

Abstract

In many optimization applications, escaping from local minima as well as computing all the global minima of an objective function is of vital importance. In this paper the Particle Swarm Optimization method is modified to locate and evaluate all the global minima of an objective function. The new approach separates the swarm appropriately whenever a candidate minimizer is detected. The technique can also be used to escape from local minima, which is very important in neural network training.
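The general idea described above can be illustrated with a small sketch. This is not the authors' exact method: it is a plain inertia-weight PSO in one dimension, restarted whenever a candidate global minimizer is detected, with a crude deflection penalty around already-found minimizers standing in for the paper's swarm-separation mechanism. All function names, parameter values, the penalty radius, and the test objective are illustrative assumptions.

```python
import random

random.seed(0)  # fixed seed so the sketch behaves reproducibly

def pso_find_minima(f, bounds, n_particles=30, iters=300, tol=1e-3, n_restarts=6):
    """Run basic PSO repeatedly; each time a candidate global minimizer is
    found, record it and restart with that region penalized, steering the
    swarm toward any remaining global minima."""
    found = []

    def penalized(x):
        # Deflection stand-in: raise the objective near already-found minima
        # so the next swarm is pushed toward unexplored basins.
        val = f(x)
        for m in found:
            d = abs(x - m)
            if d < 0.5:
                val += (0.5 - d) * 10.0
        return val

    lo, hi = bounds
    w, c1, c2 = 0.72, 1.49, 1.49  # inertia weight and acceleration coefficients
    for _ in range(n_restarts):
        xs = [random.uniform(lo, hi) for _ in range(n_particles)]
        vs = [0.0] * n_particles
        pbest = xs[:]
        pbest_val = [penalized(x) for x in xs]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g], pbest_val[g]
        for _ in range(iters):
            for i in range(n_particles):
                r1, r2 = random.random(), random.random()
                vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                         + c2 * r2 * (gbest - xs[i]))
                xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to bounds
                v = penalized(xs[i])
                if v < pbest_val[i]:
                    pbest[i], pbest_val[i] = xs[i], v
                    if v < gbest_val:
                        gbest, gbest_val = xs[i], v
        # Accept the candidate only if it is a near-global minimizer of the
        # raw objective and not a duplicate of one already recorded.
        if f(gbest) < tol and all(abs(gbest - m) > 0.25 for m in found):
            found.append(gbest)
    return sorted(found)

# Test objective with two global minima at x = -1 and x = +1, both of value 0.
minima = pso_find_minima(lambda x: (x * x - 1.0) ** 2, (-2.0, 2.0))
```

On this simple one-dimensional objective the sketch collects both global minimizers; the penalty-based deflection is only one of several ways to keep a restarted swarm away from minima it has already located.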

Keywords

Particle Swarm Optimization, Local Search, Global Minimizer, Inertia Weight, Neural Network Training



Copyright information

© Springer-Verlag Wien 2001

Authors and Affiliations

  • K. E. Parsopoulos
  • M. N. Vrahatis
  1. Department of Mathematics, University of Patras Artificial Intelligence Research Center (UPAIRC), University of Patras, Patras, Greece
