Proper Choice of Control Parameters for CoDE Algorithm

  • Petr Bujok
  • Daniela Einšpiglová
  • Hana Zámečníková
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 991)


An adaptive variant of the CoDE algorithm uses three pairs of settings of two control parameters. These combinations provide good performance when solving various types of optimisation problems. The aim of this paper is to replace the original control-parameter values in CoDE to achieve better efficiency on real-world problems. Two enhanced variants of the CoDE algorithm are proposed and compared with the original CoDE. The new combinations of the F and CR parameters are selected from the results of a preliminary study in which 441 combinations of these parameters were evaluated. The results show that the newly proposed CoDE variants (CoDE\(_\mathrm {FCR1}\) and CoDE\(_\mathrm {FCR2}\)) perform better than the original CoDE on most of the 22 real-world problems.
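The mechanism the abstract describes can be illustrated with a minimal differential-evolution sketch. The parameter pool below is the one used by the original CoDE (F = 1.0/CR = 0.1, F = 1.0/CR = 0.9, F = 0.8/CR = 0.2); for brevity the sketch applies a single rand/1/bin strategy with each of the three pairs rather than CoDE's full three-strategy composite, and the `sphere` test function, population size, and generation count are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Original CoDE control-parameter pool: three [F, CR] pairs.
PARAM_POOL = [(1.0, 0.1), (1.0, 0.9), (0.8, 0.2)]

def sphere(x):
    """Illustrative test objective (not one of the paper's 22 problems)."""
    return float(np.sum(x ** 2))

def code_sketch(f, dim, bounds, pop_size=30, max_gens=200, seed=1):
    """Simplified CoDE-style search: for each individual, generate one
    rand/1/bin trial vector per [F, CR] pair in the pool and keep the
    best trial if it is at least as good as the target."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(max_gens):
        for i in range(pop_size):
            trials = []
            for F, CR in PARAM_POOL:
                # rand/1 mutation from three distinct population members
                r1, r2, r3 = rng.choice(pop_size, 3, replace=False)
                mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
                # binomial crossover with one guaranteed mutant component
                mask = rng.random(dim) < CR
                mask[rng.integers(dim)] = True
                trials.append(np.where(mask, mutant, pop[i]))
            best = min(trials, key=f)
            bf = f(best)
            if bf <= fit[i]:
                pop[i], fit[i] = best, bf
    return pop[fit.argmin()], float(fit.min())

best_x, best_f = code_sketch(sphere, dim=5, bounds=(-5.0, 5.0))
```

The enhanced variants the paper proposes replace the values in `PARAM_POOL` with pairs selected from the 441-combination preliminary study; the surrounding search loop is unchanged.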


Keywords: Global optimisation · Differential evolution · Control parameters · CoDE · Real-world problems · Experimental comparison



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. University of Ostrava, Ostrava, Czech Republic