A Comparison Study of Surrogate Model Based Preselection in Evolutionary Optimization

  • Hao Hao
  • Jinyuan Zhang
  • Aimin Zhou
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10955)

Abstract

In evolutionary optimization, the purpose of preselection is to identify promising solutions among a set of candidate offspring solutions. Surrogate models are a popular tool for preselection: a surrogate model is built to approximate the original objective function and to estimate the fitness values of the candidate solutions, and the promising solutions are then identified based on these estimated values. This paper studies and compares surrogate model based preselection strategies in evolutionary algorithms. Systematic experiments are conducted to study the performance of four surrogate models. The experimental results suggest that surrogate model based preselection can significantly improve the performance of evolutionary algorithms.
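The abstract describes a generic preselection loop: fit a surrogate on already-evaluated solutions, score the candidate offspring cheaply with the surrogate, and keep only the most promising candidates for exact evaluation. The sketch below is a minimal illustration of that loop; the Gaussian process regressor, the function names, and the minimization assumption are illustrative choices, not the paper's exact setup (the paper itself compares four surrogate models).

```python
# Minimal sketch of surrogate-model-based preselection (illustrative only;
# the surrogate choice and parameters are assumptions, not the paper's setup).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def preselect(parents, parent_fitness, candidate_offspring, n_keep=1):
    """Fit a surrogate on evaluated parents, estimate the fitness of candidate
    offspring, and return the n_keep most promising ones (minimization)."""
    surrogate = GaussianProcessRegressor()               # surrogate of the true objective
    surrogate.fit(parents, parent_fitness)               # train on already-evaluated points
    estimated = surrogate.predict(candidate_offspring)   # cheap fitness estimates
    best_idx = np.argsort(estimated)[:n_keep]            # lowest estimated values first
    return candidate_offspring[best_idx]


# Usage: generate several trial offspring per parent (e.g. with different DE
# operators), preselect with the surrogate, then evaluate only the survivors
# with the expensive objective function.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    parents = rng.uniform(-5, 5, size=(20, 2))
    fitness = np.sum(parents ** 2, axis=1)               # toy objective: sphere function
    candidates = rng.uniform(-5, 5, size=(10, 2))
    print(preselect(parents, fitness, candidates, n_keep=2))
```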

Keywords

Surrogate model · Preselection · Evolutionary algorithm

Notes

Acknowledgement

This work is supported by the National Natural Science Foundation of China under Grants No. 61731009, 61673180, and 61703382.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Shanghai Key Laboratory of Multidimensional Information Processing, Department of Computer Science and Technology, East China Normal University, Shanghai, China