
A hybrid memory-based dragonfly algorithm with differential evolution for engineering application


The dragonfly algorithm (DA) is a swarm-based stochastic algorithm that models the static and dynamic swarming behaviors of dragonflies and has gained considerable popularity owing to its low computational cost and fast convergence on complex optimization problems. However, DA lacks internal memory and therefore cannot keep track of its best solutions from previous generations. Its solutions also lack diversity, giving the algorithm a propensity to become trapped in local optima. In this paper, an iteration-level hybridization of the dragonfly algorithm with differential evolution (DE) is proposed, named the hybrid memory-based dragonfly algorithm with differential evolution (DADE). DE is selected for its computational ability, fast convergence, and capacity to explore the solution space through crossover and mutation. Unlike DA, DADE stores the best solution of each iteration in memory and passes it to DE, which enhances population diversity through improved mutation and accordingly increases the probability of reaching the global optimum efficiently. The efficiency of the proposed algorithm is measured on a standard set of 74 benchmark functions: 23 classical mathematical benchmark functions, 6 composite benchmark functions of CEC2005, 15 benchmark functions of CEC2015, and 30 benchmark functions of CEC2017. The DADE algorithm is applied to engineering design problems such as welded beam design, pressure vessel design, and tension/compression spring design. It is also applied to the emerging problem of secondary-user throughput maximization in an energy-harvesting cognitive radio network. A comparative performance analysis between DADE and other popular state-of-the-art optimization algorithms is carried out, and the significance of the results is discussed.
The results demonstrate significant improvement and clear advantages of DADE over conventional DE, PSO, and DA in terms of various performance measures. The results of applying DADE to several important engineering design problems are encouraging and validate its suitability for solving practical engineering challenges. Lastly, a statistical analysis of the algorithm is performed and compared against other powerful optimization algorithms to establish its superiority.
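The iteration-level DA–DE hybridization described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the dragonfly step is simplified to attraction toward the best-so-far solution ("food") and distraction from the worst ("enemy"), and all names and parameter values (`dade_sketch`, `F`, `CR`, the step scaling, etc.) are assumptions chosen for illustration only.

```python
import numpy as np

def dade_sketch(obj, dim=10, pop=30, iters=200, lb=-100.0, ub=100.0,
                F=0.5, CR=0.9, seed=0):
    """Illustrative iteration-level DA+DE hybrid (not the paper's code).

    Each iteration first applies a simplified dragonfly-style move
    (attraction to the best-so-far 'food', distraction from the worst
    'enemy'), then refines every individual with DE/rand/1 mutation,
    binomial crossover, and greedy selection. The best solution found
    so far is kept in memory across iterations."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))
    fit = np.array([obj(x) for x in X])
    best = X[fit.argmin()].copy()          # memory: best solution so far
    best_f = float(fit.min())
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters          # inertia weight decays linearly
        food, enemy = best, X[fit.argmax()]
        # simplified dragonfly step: random walk + food/enemy terms
        step = w * rng.standard_normal(X.shape) + (food - X) - 0.1 * (enemy - X)
        X = np.clip(X + 0.1 * step, lb, ub)
        fit = np.array([obj(x) for x in X])
        # DE refinement: DE/rand/1 mutation + binomial crossover + greedy selection
        for i in range(pop):
            idx = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            a, b, c = X[idx]
            mutant = np.clip(a + F * (b - c), lb, ub)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True   # at least one mutant gene survives
            trial = np.where(cross, mutant, X[i])
            f_trial = obj(trial)
            if f_trial < fit[i]:              # keep the better of trial/parent
                X[i], fit[i] = trial, f_trial
        if fit.min() < best_f:                # update the memory
            best_f = float(fit.min())
            best = X[fit.argmin()].copy()
    return best, best_f
```

A convex test function such as the sphere function, `obj = lambda x: float((x * x).sum())`, is a convenient smoke test for the loop: the memorized best fitness is non-increasing by construction, since it is only overwritten when a strictly better solution appears.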






This work is supported by Ministry of Electronics and Information Technology, Govt. of India (Reference Grant No.: 21(1)/2015-CC&BT).


Author information

Correspondence to Sanjoy Debnath.

Ethics declarations

Conflict of interest

The authors declare no conflicts of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Debnath, S., Baishya, S., Sen, D. et al. A hybrid memory-based dragonfly algorithm with differential evolution for engineering application. Engineering with Computers (2020).



Keywords

  • Optimization
  • Evolutionary algorithms
  • Differential evolution
  • Dragonfly algorithm
  • Hybridization