A New Evolutionary Algorithm with Deleting and Jumping Strategies for Global Optimization

  • Fei Wei
  • Shugang Li
  • Le Gao
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 81)


For global optimization problems with many local optimal solutions, evolutionary algorithms are efficient parallel search methods, but they easily become trapped in local optima, which greatly reduces their efficiency and effectiveness. In this paper, first, a new deleting strategy is proposed that eliminates all local optimal solutions no better than the currently obtained one. Second, when the algorithm becomes trapped in a local optimal solution, a new jumping strategy is proposed that allows it to escape the current local optimal solution and find a better one. Based on these strategies, a new algorithm, called the evolutionary algorithm with deleting and jumping strategies (briefly, EADJ), is proposed, and its convergence is proved theoretically. Simulations on 25 standard benchmark problems indicate that the proposed deleting and jumping strategies are effective; furthermore, comparisons with several well-performing existing algorithms indicate that EADJ is more effective and efficient.
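The abstract does not give the strategies in detail, so the following is only a minimal illustrative sketch, not the authors' EADJ algorithm. It pairs a plain evolutionary loop with two stand-in mechanisms: a "deleting" transform that flattens every region of the landscape whose value is no better than the best local optimum found so far (so those local minima can no longer attract the search), and a "jumping" step that applies a large perturbation around the best solution once progress stalls. The function `rastrigin`, the stall threshold, and all parameter values are assumptions chosen for illustration.

```python
import math
import random

def rastrigin(x):
    # Standard multimodal benchmark (many local minima, global minimum 0 at the origin).
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def deleted(f, x, f_best):
    # "Deleting" stand-in: clamp the landscape at f_best, so every local
    # minimum no better than the current best becomes a flat plateau and
    # cannot trap selection.
    fx = f(x)
    return fx if fx < f_best else f_best

def mutate(x, scale):
    # Gaussian perturbation of every coordinate.
    return [xi + random.gauss(0, scale) for xi in x]

def eadj_sketch(f, dim=2, pop_size=20, generations=200, seed=0):
    random.seed(seed)
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)
    f_best = f(best)
    stall = 0
    for _ in range(generations):
        # Select on the clamped ("deleted") landscape, not the raw one.
        g = lambda x: deleted(f, x, f_best)
        offspring = [mutate(x, 0.3) for x in pop]
        pop = sorted(pop + offspring, key=g)[:pop_size]
        f_new = f(pop[0])
        if f_new < f_best - 1e-12:
            best, f_best, stall = pop[0][:], f_new, 0
        else:
            stall += 1
        if stall >= 10:
            # "Jumping" stand-in: scatter the population far from the
            # current optimum to escape its basin of attraction.
            pop = [mutate(best, 2.0) for _ in range(pop_size)]
            stall = 0
    return best, f_best
```

Because `f_best` only ever decreases and the jumping step restarts exploration from the incumbent, the sketch monotonically improves while retaining a chance to leave the current basin; the paper's actual strategies and convergence proof are, of course, more refined.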


Keywords: Evolutionary algorithm · Global optimization · Deleting strategy · Jumping strategy



This work was supported by the National Natural Science Foundation of China (No. U1404622) and the Cultivation Fund of Xi’an University of Science and Technology (No. 201644).



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. College of Sciences, Xi’an University of Science and Technology, Xi’an, China
  2. College of Safety Science and Engineering, Xi’an University of Science and Technology, Xi’an, China
