A Study of the Combination of Variation Operators in the NSGA-II Algorithm

  • Conference paper
Advances in Artificial Intelligence (CAEPIA 2013)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8109)

Abstract

Multi-objective evolutionary algorithms rely on variation operators as their basic mechanism for carrying out the evolutionary process. These operators are usually fixed and applied in the same way throughout the algorithm's execution, e.g., the mutation probability in genetic algorithms. This paper analyses whether a more dynamic approach, combining different operators whose application rates vary along the search, can improve on the classical static behavior. To this end, we explore the combined use of three different operators (simulated binary crossover, the differential evolution operator, and polynomial mutation) in the NSGA-II algorithm. We consider two strategies for selecting the operators: random and adaptive. The resulting variants have been tested on a set of 19 complex problems, and our results indicate that both schemes significantly improve the performance of the original NSGA-II algorithm, with the random and adaptive variants achieving the best overall results on the bi-objective and three-objective problems considered, respectively.
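
The paper's implementation is not shown on this page; the sketch below only illustrates, in Python, the two operator-selection strategies the abstract describes, random and adaptive. The operator names, the AdaptiveOperatorSelector class, and the reward-proportional update rule are assumptions made for illustration, not the authors' actual NSGA-II code.

```python
import random

# Illustrative operator identifiers (assumed names, not the paper's API).
OPERATORS = ["sbx_crossover", "de_operator", "polynomial_mutation"]


def select_operator_random(operators=OPERATORS):
    """Random strategy: each offspring is produced by an operator
    chosen uniformly at random."""
    return random.choice(operators)


class AdaptiveOperatorSelector:
    """Adaptive strategy (sketch): selection is biased towards operators
    whose offspring recently survived into the next population. The
    reward-proportional update used here is an assumption; the paper's
    exact adaptation rule may differ."""

    def __init__(self, operators=OPERATORS, initial_score=1.0):
        self.scores = {op: initial_score for op in operators}

    def select(self):
        # Roulette-wheel selection proportional to accumulated scores.
        total = sum(self.scores.values())
        r = random.uniform(0.0, total)
        acc = 0.0
        for op, score in self.scores.items():
            acc += score
            if r <= acc:
                return op
        return op  # floating-point fallback

    def update(self, op, offspring_survived):
        # Reward operators whose offspring were accepted by NSGA-II's
        # replacement (non-dominated sorting plus crowding distance).
        if offspring_survived:
            self.scores[op] += 1.0
```

In an NSGA-II loop, select() would be called once per offspring and update() after environmental selection, so that operators producing surviving offspring are applied more often as the search progresses.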

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Nebro, A.J., Durillo, J.J., Machín, M., Coello Coello, C.A., Dorronsoro, B. (2013). A Study of the Combination of Variation Operators in the NSGA-II Algorithm. In: Bielza, C., et al. (eds.) Advances in Artificial Intelligence. CAEPIA 2013. Lecture Notes in Computer Science, vol 8109. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40643-0_28

  • DOI: https://doi.org/10.1007/978-3-642-40643-0_28

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-40642-3

  • Online ISBN: 978-3-642-40643-0

  • eBook Packages: Computer Science, Computer Science (R0)
