Multiobjective Metamodel–Assisted Memetic Algorithms

Multi-Objective Memetic Algorithms

Part of the book series: Studies in Computational Intelligence (SCI, volume 171)

Abstract

The hybridization of global metaheuristics, such as evolutionary algorithms (EAs), with gradient-based local search methods, in the framework of the so-called memetic algorithms (MAs), can be used to solve multi-objective optimization problems, in either the Lamarckian or the Baldwinian spirit. Reducing the CPU cost of MAs is necessary for problems with computationally demanding evaluations. For the same purpose, metamodels are in widespread use in EAs, giving rise to various kinds of metamodel-assisted EAs (MAEAs). Metamodels are surrogate evaluation models of various types: multilayer perceptrons, radial basis function networks, polynomial regression models, kriging, etc. A good practice is to use local metamodels, trained on the fly for each new individual using selected entries from a database in which all previously evaluated offspring are recorded. The selection of suitable training patterns is important in order to minimize the prediction error of the metamodel. The MAEA developed by the authors in the past uses the inexact pre-evaluation (IPE) technique, which starts after running a conventional EA for just a few generations on the exact evaluation model. All exactly evaluated offspring are stored in the database. In subsequent generations, a local metamodel is trained for each new offspring to approximate the objective functions; based on this approximation, a few top individuals (in the Pareto-front sense) are selected for exact re-evaluation.
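The IPE loop described above can be sketched as follows. This is a minimal, illustrative implementation, not the authors' code: the function names (`ipe_generation`, `rbf_fit_predict`) and parameters (`k_nearest`, `n_exact`, the Gaussian kernel width `eps`) are our own assumptions. Each offspring gets a local RBF metamodel trained on its nearest database entries; the metamodel predictions are screened for non-dominance, and only the predicted Pareto-front individuals are exactly re-evaluated and appended to the database.

```python
import numpy as np

def rbf_fit_predict(X_train, Y_train, x_new, eps=1.0):
    """Fit a Gaussian RBF interpolant on (X_train, Y_train), predict at x_new."""
    d = np.linalg.norm(X_train[:, None, :] - X_train[None, :, :], axis=-1)
    Phi = np.exp(-(d / eps) ** 2)
    # Small diagonal regularization guards against an ill-conditioned system.
    w = np.linalg.solve(Phi + 1e-6 * np.eye(len(X_train)), Y_train)
    phi_new = np.exp(-(np.linalg.norm(X_train - x_new, axis=-1) / eps) ** 2)
    return phi_new @ w  # approximate objective-function vector at x_new

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b."""
    return np.all(a <= b) and np.any(a < b)

def ipe_generation(offspring, database, exact_eval, k_nearest=8, n_exact=3):
    """One IPE step: pre-evaluate offspring on local metamodels,
    exactly re-evaluate only the predicted front; grow the database."""
    X_db = np.array([x for x, _ in database])
    Y_db = np.array([y for _, y in database])
    approx = []
    for x in offspring:
        # Train a local metamodel on the k nearest database entries.
        idx = np.argsort(np.linalg.norm(X_db - x, axis=-1))[:k_nearest]
        approx.append(rbf_fit_predict(X_db[idx], Y_db[idx], np.asarray(x)))
    approx = np.array(approx)
    # Keep only offspring that are non-dominated w.r.t. the metamodel predictions.
    nd = [i for i in range(len(offspring))
          if not any(dominates(approx[j], approx[i])
                     for j in range(len(offspring)) if j != i)]
    chosen = nd[:n_exact]
    for i in chosen:  # exact re-evaluation of the top individuals only
        database.append((offspring[i], exact_eval(offspring[i])))
    return chosen
```

The key cost saving is visible in the last loop: the expensive `exact_eval` runs only `n_exact` times per generation instead of once per offspring, while the database keeps growing so that later local metamodels become more accurate.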





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this chapter

Georgopoulou, C.A., Giannakoglou, K.C. (2009). Multiobjective Metamodel–Assisted Memetic Algorithms. In: Goh, CK., Ong, YS., Tan, K.C. (eds) Multi-Objective Memetic Algorithms. Studies in Computational Intelligence, vol 171. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-88051-6_8


  • DOI: https://doi.org/10.1007/978-3-540-88051-6_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-88050-9

  • Online ISBN: 978-3-540-88051-6

  • eBook Packages: Engineering
