A Surrogate-Based Intelligent Variation Operator for Multiobjective Optimization

  • Alan Díaz-Manríquez
  • Gregorio Toscano-Pulido
  • Ricardo Landa-Becerra
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7401)


Evolutionary algorithms are meta-heuristics that have shown flexibility, adaptability, and good performance when solving Multiobjective Optimization Problems (MOPs). However, in order to achieve acceptable results, Multiobjective Evolutionary Algorithms (MOEAs) usually require many evaluations of the objective function. When each of these evaluations carries a high computational cost, such expensive problems can remain intractable even for these meta-heuristics. To reduce the computational cost of expensive optimization problems, some researchers have replaced the real objective function with a computationally inexpensive surrogate model. In this paper, we propose a new intelligent variation operator based on surrogate models. To validate the operator, we first incorporate it into a stand-alone search mechanism. Results indicate that the resulting algorithm can optimize MOPs, but it exhibits premature convergence on multifrontal MOPs. To overcome this drawback, the proposed operator was successfully hybridized with a MOEA. Results show that this hybrid approach outperformed both the stand-alone algorithm and the evolutionary algorithm without the operator.
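To make the idea concrete, the following is a minimal sketch of surrogate-assisted pre-selection, the general pattern the abstract describes: fit a cheap surrogate per objective on previously evaluated points, generate many candidate offspring by ordinary variation, rank them with the surrogate, and spend a real (expensive) evaluation only on the most promising one. This is an illustrative reconstruction, not the paper's actual operator; the Gaussian-RBF surrogate, the blend-plus-noise variation, the sum scalarization, and the toy bi-objective function are all assumptions made so the sketch is self-contained and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_objectives(x):
    """Stand-in bi-objective function (cheap here only so the sketch runs)."""
    f1 = x[0]
    g = 1.0 + 9.0 * np.mean(x[1:])
    f2 = g * (1.0 - np.sqrt(max(f1, 0.0) / g))
    return np.array([f1, f2])

def fit_rbf(X, y, eps=1.0):
    """Interpolating Gaussian-RBF surrogate for one objective."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = np.exp(-(eps * d) ** 2)
    # Small ridge term keeps the linear solve well conditioned.
    w = np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)
    return lambda x: np.exp(-(eps * np.linalg.norm(X - x, axis=1)) ** 2) @ w

def surrogate_preselect(parents, archive_X, archive_F, n_candidates=20):
    """Generate many cheap candidates, really evaluate only the best one."""
    models = [fit_rbf(archive_X, archive_F[:, m])
              for m in range(archive_F.shape[1])]
    best, best_score = None, np.inf
    for _ in range(n_candidates):
        p = parents[rng.integers(len(parents))]
        q = parents[rng.integers(len(parents))]
        # Blend crossover plus Gaussian mutation, clipped to the box [0, 1].
        child = np.clip(0.5 * (p + q) + rng.normal(0.0, 0.1, p.shape), 0.0, 1.0)
        pred = np.array([m(child) for m in models])
        score = pred.sum()  # crude scalarization, just for the sketch
        if score < best_score:
            best, best_score = child, score
    return best, expensive_objectives(best)

# Initial design of 12 evaluated points, then one pre-screened variation step.
X = rng.random((12, 4))
F = np.array([expensive_objectives(x) for x in X])
child, f_child = surrogate_preselect(X, X, F)
print(child.shape, f_child.shape)  # (4,) (2,)
```

Only one call to `expensive_objectives` is spent per offspring here, regardless of `n_candidates`; this is the cost-saving mechanism surrogate-assisted MOEAs exploit.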


Keywords: Evolutionary Algorithms · Intelligent Genetic Variation Operator · Multiobjective Optimization





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Alan Díaz-Manríquez¹
  • Gregorio Toscano-Pulido¹
  • Ricardo Landa-Becerra¹

  1. Information Technology Laboratory, CINVESTAV-Tamaulipas, Parque Científico y Tecnológico TECNOTAM, Cd. Victoria, México
