Abstract
The hybridization of global metaheuristics, such as evolutionary algorithms (EAs), with gradient-based methods for local search, in the framework of so-called memetic algorithms (MAs), can be used to solve multi-objective optimization problems in either the Lamarckian or the Baldwinian spirit. Reducing the CPU cost of MAs is necessary for problems with computationally demanding evaluations. For the same purpose, metamodels are in widespread use in EAs, giving rise to various kinds of metamodel-assisted EAs (MAEAs). Metamodels are surrogate evaluation models of various types: multilayer perceptrons, radial basis function networks, polynomial regression models, kriging, etc. A good practice is to use local metamodels, trained on the fly for each new individual using selected entries from a database in which all previously evaluated offspring are recorded; selecting suitable training patterns is important in order to minimize the prediction error of the metamodel. The MAEA developed by the authors in the past uses the inexact pre-evaluation (IPE) technique, which starts after running a conventional EA on the exact evaluation model for just a few generations. All exactly evaluated offspring are stored in the database. In the subsequent generations, a local metamodel is trained for each new offspring to approximate its objective function values and, based on this approximation, only a few top individuals (in the Pareto front sense) are selected for exact re-evaluation.
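The IPE step described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: it assumes a Gaussian RBF interpolant as the local metamodel, a fixed kernel width, k-nearest-neighbour selection of training patterns from the database, and simple Pareto dominance ranking to pick the few offspring that earn an exact re-evaluation. All function and parameter names (`rbf_fit`, `ipe_select`, `k`, `n_exact`) are hypothetical.

```python
# Sketch of inexact pre-evaluation (IPE) with local RBF metamodels.
# Assumptions (not from the chapter): Gaussian kernels, fixed width,
# k nearest database entries as training patterns, hypothetical names.
import numpy as np

def rbf_fit(X, y, width=1.0):
    """Train a Gaussian RBF interpolant with centres at the training points."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2.0 * width ** 2))
    # Small ridge term keeps the system well conditioned.
    w = np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)
    return X, w, width

def rbf_predict(model, x):
    """Approximate the objective vector of a single individual x."""
    centres, w, width = model
    d2 = ((centres - x) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2)) @ w

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b."""
    return np.all(a <= b) and np.any(a < b)

def ipe_select(offspring, database, exact_eval, k=10, n_exact=2):
    """Pre-evaluate offspring on local metamodels; exactly re-evaluate
    only the top (non-dominated) individuals and store them in the database."""
    X_db = np.array([x for x, _ in database])
    F_db = np.array([f for _, f in database])
    approx = []
    for x in offspring:
        # Select training patterns: the k database entries closest to x.
        idx = np.argsort(((X_db - x) ** 2).sum(-1))[:k]
        model = rbf_fit(X_db[idx], F_db[idx])
        approx.append(rbf_predict(model, x))
    approx = np.array(approx)
    # Keep only the non-dominated approximations (Pareto front sense).
    n = len(offspring)
    front = [i for i in range(n)
             if not any(dominates(approx[j], approx[i])
                        for j in range(n) if j != i)]
    chosen = front[:n_exact]
    for i in chosen:  # exact re-evaluation enriches the database
        database.append((offspring[i], exact_eval(offspring[i])))
    return chosen
```

In a full MAEA, `ipe_select` would sit inside the generation loop after the EA's variation operators, replacing most exact evaluations; the Lamarckian or Baldwinian local search would then refine the exactly re-evaluated individuals.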
© 2009 Springer-Verlag Berlin Heidelberg
Georgopoulou, C.A., Giannakoglou, K.C. (2009). Multiobjective Metamodel–Assisted Memetic Algorithms. In: Goh, CK., Ong, YS., Tan, K.C. (eds) Multi-Objective Memetic Algorithms. Studies in Computational Intelligence, vol 171. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-88051-6_8
Print ISBN: 978-3-540-88050-9
Online ISBN: 978-3-540-88051-6