Dependency of GPA-ES Algorithm Efficiency on ES Parameters Optimization Strength

  • Tomas Brandejsky
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 554)


In this work, the relation between the number of ES iterations and the convergence of the whole GPA-ES hybrid algorithm is studied, motivated by the increasing need to analyze and model large data sets. Evolutionary algorithms are applicable in areas not covered by neural networks and deep learning, such as the search for an algebraic model of data. The difference between time complexity and algorithmic complexity is also discussed, as are the problems of a multitasking implementation of GPA, where external influences complicate improving GPA efficiency through Pseudo-Random Number Generator (PRNG) choice optimization.

Hybrid evolutionary algorithms like GPA-ES, which use a GPA for solution structure development and an Evolutionary Strategy (ES) for parameter identification, are controlled by many parameters. The most significant are the size of the GPA population and the sizes of the ES populations associated with each particular individual in the GPA population. There is also a limit on the number of ES evolutionary cycles. This limit plays two contradictory roles: on one side, a larger number of ES iterations means less chance of discarding a good solution structure because of wrongly identified parameters; on the other side, a large number of ES iterations significantly increases computational time and thus limits the application domain of the GPA-ES algorithm.
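The nested structure described above can be sketched as follows. This is a minimal illustrative skeleton, not the authors' implementation: the toy fitness function, the single-parameter ES, and all names (`es_pop_size`, `es_iteration_limit`, `gpa_es`) are assumptions made for illustration. It shows the key point of the text: every GPA individual (a candidate model structure) gets its own ES run, so the ES iteration limit multiplies directly into total cost while also bounding how well each structure's parameters are identified.

```python
import random

def evolve_parameters(structure, es_pop_size, es_iteration_limit, fitness):
    """Toy Evolutionary Strategy: tune one numeric parameter of a structure.

    The es_iteration_limit cap embodies the trade-off from the text: too few
    iterations may reject a good structure because its parameters were
    poorly identified; too many iterations inflate computational time.
    """
    population = [random.uniform(-10.0, 10.0) for _ in range(es_pop_size)]
    for _ in range(es_iteration_limit):
        # Select the best candidate, then refill the population with
        # Gaussian mutations of it (a simple (1, lambda)-style step).
        best = min(population, key=lambda p: fitness(structure, p))
        population = [best + random.gauss(0.0, 0.5) for _ in range(es_pop_size)]
    best = min(population, key=lambda p: fitness(structure, p))
    return best, fitness(structure, best)

def gpa_es(structures, es_pop_size, es_iteration_limit, fitness):
    """One GPA-ES evaluation pass: each structure is scored only after its
    own nested ES run has identified its parameters."""
    scored = []
    for s in structures:
        params, err = evolve_parameters(s, es_pop_size,
                                        es_iteration_limit, fitness)
        scored.append((err, s, params))
    return min(scored)  # best (error, structure, parameters) triple

# Toy regression task: candidate structures are exponents n in p * x**n,
# and the data follow y = 3 * x**2, so structure n = 2 should win.
def fitness(n, p):
    xs = [0.5, 1.0, 1.5, 2.0]
    return sum((p * x**n - 3.0 * x**2) ** 2 for x in xs)

random.seed(1)
err, n, p = gpa_es(structures=[1, 2, 3], es_pop_size=20,
                   es_iteration_limit=40, fitness=fitness)
print(n, round(p, 2))  # the quadratic structure with p near 3 should win
```

Lowering `es_iteration_limit` in this sketch reproduces the failure mode discussed above: with too few ES cycles, the parameter `p` of the correct structure may stay far from 3, and a wrong structure can temporarily score better.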


Keywords: Genetic Programming Algorithm · Evolutionary Strategy · Hybrid Evolutionary System · Algorithm efficiency · Optimization



Access to the computing and storage facilities owned by parties and projects contributing to the National Grid Infrastructure MetaCentrum, provided under the programme “Projects of Large Research, Development, and Innovations Infrastructures” (CESNET LM2015042), is greatly appreciated.



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. University of Pardubice, Pardubice, Czech Republic
