Differential Evolution Algorithms Used to Optimize Weights of Neural Network Solving Pole-Balancing Problem

  • Jan Vargovsky
  • Lenka Skanderova
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 554)

Abstract

Differential evolution (DE) has been successfully used to solve difficult optimization problems, and novel DE variants are developed every year to outperform their predecessors. JADE is a well-known DE algorithm that combines the current-to-pbest mutation strategy with adaptation of the control parameters. SHADE was developed to eliminate some bottlenecks of JADE, especially its tendency toward premature convergence. The performance of these algorithms has been demonstrated on various benchmarks. The goal of this work is to compare the performance of selected DE algorithms used to optimize the weights of an artificial neural network solving the pole-balancing problem.

Keywords

Differential evolution · JADE · SHADE · Artificial neural network · Pole-balancing problem
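The mutation strategy named in the abstract, current-to-pbest/1 as used by JADE and SHADE, can be sketched as follows. This is an illustrative stand-in, not the paper's implementation: the cart-pole fitness is replaced by a toy regression loss, fixed F and CR are used instead of JADE's parameter adaptation, and all function names, network sizes, and hyperparameters here are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, x):
    # Tiny 1-4-1 feed-forward net; DE treats the flattened
    # weight vector (13 values) as one real-valued candidate.
    h = np.tanh(x @ w[:4].reshape(1, 4) + w[4:8])
    return h @ w[8:12].reshape(4, 1) + w[12]

def mse(w, x, y):
    return float(np.mean((forward(w, x) - y) ** 2))

def de_current_to_pbest(x, y, dim=13, pop_size=30, F=0.5, CR=0.9,
                        p=0.2, gens=300):
    pop = rng.uniform(-1.0, 1.0, (pop_size, dim))
    fit = np.array([mse(w, x, y) for w in pop])
    for _ in range(gens):
        top = np.argsort(fit)[: max(1, int(p * pop_size))]  # the p*NP best
        for i in range(pop_size):
            pbest = pop[rng.choice(top)]
            r1, r2 = rng.choice([j for j in range(pop_size) if j != i],
                                2, replace=False)
            # current-to-pbest/1 mutation, the strategy used by JADE/SHADE:
            # v = x_i + F*(x_pbest - x_i) + F*(x_r1 - x_r2)
            v = pop[i] + F * (pbest - pop[i]) + F * (pop[r1] - pop[r2])
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True  # binomial crossover, >=1 gene from v
            trial = np.where(mask, v, pop[i])
            f_trial = mse(trial, x, y)
            if f_trial <= fit[i]:           # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Toy regression task standing in for the pole-balancing fitness function
x = np.linspace(-1, 1, 40).reshape(-1, 1)
y = np.sin(np.pi * x)
w, loss = de_current_to_pbest(x, y)
```

Because selection is greedy, the best loss in the population never increases; the pull toward a randomly chosen member of the top p fraction (rather than the single best) is what gives current-to-pbest its balance between convergence speed and diversity.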

Notes

Acknowledgement

This research was financially supported by SGS Grant No. 2018/177, VSB - Technical University of Ostrava, and by the NAVY and MERLIN research labs.


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Faculty of Electrical Engineering and Computer Science, VSB – Technical University of Ostrava, Ostrava, Czech Republic
