Abstract
Simulation-based design optimization relies on computational models to evaluate objective and constraint functions. Typical challenges include unavailable gradients or unreliable approximations thereof, excessive computational cost, numerical noise, multi-modality, and models that fail to return a value. It has become common to use the term “blackbox” for a computational model that exhibits any of these characteristics and/or is inaccessible to the design engineer (i.e., cannot be modified directly to address these issues). A possible remedy is surrogate-based derivative-free optimization, provided it is applied carefully with appropriate formulations and algorithms. In this work, we use the R package dynaTree to build statistical surrogates of the blackboxes and the Mesh Adaptive Direct Search (MADS) algorithm for derivative-free optimization. We present different formulations for the surrogate problem solved at each search step of MADS within a surrogate management framework. The proposed formulations are tested on two simulation-based multidisciplinary design optimization problems. Numerical results confirm that the use of statistical surrogates in MADS improves the efficiency of the optimization algorithm.
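The surrogate management framework sketched in the abstract can be illustrated schematically: each iteration first runs a search step that ranks candidate mesh points with a cheap surrogate fitted to the cache of true blackbox evaluations, and falls back to a poll step along mesh directions if the search fails. The following minimal Python sketch uses a crude inverse-distance-weighted surrogate in place of the dynaTree statistical model and a simplified coordinate poll in place of the full MADS machinery; all names (`mads_with_surrogate`, `surrogate`, the candidate-sampling scheme) are illustrative assumptions, not the chapter's implementation.

```python
import random

def mads_with_surrogate(f, x0, budget=200, delta=1.0, seed=0):
    """Toy surrogate-assisted mesh adaptive direct search (unconstrained)."""
    rng = random.Random(seed)
    n = len(x0)
    cache = {}  # evaluation cache: point -> true blackbox value

    def evaluate(x):
        key = tuple(round(v, 12) for v in x)
        if key not in cache:
            cache[key] = f(x)  # the only place the true blackbox is called
        return cache[key]

    def surrogate(x):
        # Inverse-distance-weighted prediction from cached true evaluations
        # (a cheap stand-in for the dynaTree statistical surrogate).
        num = den = 0.0
        for p, fp in cache.items():
            d2 = sum((a - b) ** 2 for a, b in zip(x, p))
            if d2 == 0.0:
                return fp
            num += fp / d2
            den += 1.0 / d2
        return num / den

    x, fx = list(x0), evaluate(x0)
    while len(cache) < budget and delta > 1e-9:
        # Search step: rank random mesh candidates by the surrogate,
        # then spend one true evaluation on the most promising one.
        candidates = [[xi + delta * rng.choice([-2, -1, 1, 2]) for xi in x]
                      for _ in range(8)]
        best = min(candidates, key=surrogate)
        if evaluate(best) < fx:
            x, fx = best, evaluate(best)
            continue  # search success: skip the poll
        # Poll step: plus/minus coordinate directions on the current mesh.
        improved = False
        for i in range(n):
            for s in (+1, -1):
                y = list(x)
                y[i] += s * delta
                if evaluate(y) < fx:
                    x, fx, improved = y, evaluate(y), True
                    break
            if improved:
                break
        delta = delta * 2 if improved else delta / 2  # mesh size update
    return x, fx
```

On a smooth toy problem such as minimizing (x1 - 1)^2 + (x2 + 2)^2 from the origin, the sketch converges to the vicinity of the minimizer within the evaluation budget; the point of the exercise is only to show how the surrogate steers the search step while the poll step retains the convergence safeguard.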
We are honored to have this work appear in a book dedicated to the memory of Professor M.G. Karlaftis.
This material was presented by the third author in an invited semi-plenary lecture at OPT-i 2014.
Notes
- 1.
There may be situations where the properties of the objective function or some of the constraints do not require the construction and use of surrogate models, e.g., if one of these functions is smooth and inexpensive and has an analytical expression.
References
Abramson MA, Audet C, Couture G, Dennis JE Jr, Le Digabel S, Tribes C. The NOMAD project. https://www.gerad.ca/nomad
AIAA/UTC/Pratt & Whitney (1995/1996) Undergraduate individual aircraft design competition
Audet C, Dennis JE Jr (2003) Analysis of generalized pattern searches. SIAM J Optim 13(3):889–903
Audet C, Dennis JE Jr (2006) Mesh adaptive direct search algorithms for constrained optimization. SIAM J Optim 17(1):188–217
Audet C, Béchard V, Le Digabel S (2008) Nonsmooth optimization through mesh adaptive direct search and variable neighborhood search. J Glob Optim 41(2):299–318
Bai Z (2002) Krylov subspace techniques for reduced-order modeling of large-scale dynamical systems. Appl Numer Math 43(1–2):9–44
Bandler JW, Cheng QS, Dakroury SA, Mohamed AS, Bakr MH, Madsen K, Sondergaard J (2004) Space mapping: the state of the art. IEEE Trans Microw Theory Tech 52(1):337–361
Booker AJ, Dennis JE Jr, Frank PD, Serafini DB, Torczon V, Trosset MW (1999) A rigorous framework for optimization of expensive functions by surrogates. Struct Multi Optim 17(1):1–13
Carvalho CM, Johannes M, Lopes HF, Polson NG (2010) Particle learning and smoothing. Stat Sci 25(1):88–106
Carvalho CM, Lopes HF, Polson NG, Taddy MA (2010) Particle learning for general mixtures. Bayesian Anal 5(4):709–740
Chipman HA, George EI, McCulloch RE (1998) Bayesian CART model search (with discussion). J Am Stat Assoc 93(443):935–960
Chipman HA, George EI, McCulloch RE (2002) Bayesian treed models. Mach Learn 48(1–3):299–320
Clarke FH (1983) Optimization and nonsmooth analysis. Wiley, New York. Reissued in 1990 by SIAM Publications, Philadelphia, as vol 5 in the series Classics in Applied Mathematics
Cohn DA (1996) Neural network exploration using optimal experimental design. Adv Neural Inf Process Syst 6(9):679–686
Conn AR, Scheinberg K, Vicente LN (2009) Introduction to derivative-free optimization. MOS/SIAM series on optimization. SIAM, Philadelphia
Conn AR, Le Digabel S (2013) Use of quadratic models with mesh-adaptive direct search for constrained black box optimization. Optim Methods Softw 28(1):139–158
Cortes C, Vapnik V (1995) Support-vector networks. Mach Learn 20(3):273–297
Custódio AL, Rocha H, Vicente LN (2010) Incorporating minimum Frobenius norm models in direct search. Comput Optim Appl 46(2):265–278
Fletcher R, Leyffer S (2002) Nonlinear programming without a penalty function. Math Program Ser A 91:239–269
Forrester AIJ, Keane AJ (2009) Recent advances in surrogate-based optimization. Prog Aerosp Sci 45(1–3):50–79
Goldberg DE (1989) Genetic algorithms in search, optimization and machine learning. Addison-Wesley, Boston
Gramacy RB, Le Digabel S (2011) The mesh adaptive direct search algorithm with treed Gaussian process surrogates. Technical Report G-2011-37, Les cahiers du GERAD. To appear in Pac J Optim
Gramacy RB, Taddy MA (2010) dynaTree: An R package implementing dynamic trees for learning and design. Software available at http://CRAN.R-project.org/package=dynaTree
Gramacy RB, Lee HKH (2008) Bayesian treed Gaussian process models with an application to computer modeling. J Am Stat Assoc 103(483):1119–1130
Gramacy RB, Taddy MA, Wild SM (2013) Variable selection and sensitivity analysis using dynamic trees, with an application to computer code performance tuning. Ann Appl Stat 7(1):51–80
Jones DR, Schonlau M, Welch WJ (1998) Efficient global optimization of expensive black box functions. J Glob Optim 13(4):455–492
Jones DR (2001) A taxonomy of global optimization methods based on response surfaces. J Glob Optim 21:345–383
Kodiyalam S (2001) Multidisciplinary aerospace systems optimization. Technical Report NASA/CR-2001-211053, Lockheed Martin Space Systems Company, Computational AeroSciences Project, Sunnyvale, CA
Krige DG (1951) A statistical approach to some mine valuations and allied problems at the Witwatersrand. Master’s thesis, University of Witwatersrand
Le Digabel S (2011) Algorithm 909: NOMAD: Nonlinear optimization with the MADS algorithm. ACM Trans Math Softw 37(4):44:1–44:15
Liem RP (2007) Surrogate modeling for large-scale black-box systems. Master’s thesis, School of Engineering, Computation for Design and Optimization Program
McKay MD, Beckman RJ, Conover WJ (1979) A comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics 21(2):239–245
Moré JJ, Wild SM (2009) Benchmarking derivative-free optimization algorithms. SIAM J Optim 20(1):172–191
Queipo N, Haftka R, Shyy W, Goel T, Vaidyanathan R, Kevintucker P (2005) Surrogate-based analysis and optimization. Prog Aerosp Sci 41(1):1–28
Rasmussen CE, Williams CKI (2006) Gaussian processes for machine learning. The MIT Press, Cambridge
Schonlau M, Jones DR, Welch WJ (1998) Global versus local search in constrained optimization of computer models. In: New developments and applications in experimental design, number 34 in IMS Lecture Notes–Monograph Series, pp 11–25. Institute of Mathematical Statistics
Serafini DB (1998) A framework for managing models in nonlinear optimization of computationally expensive functions. Ph.D. thesis, Department of Computational and Applied Mathematics, Rice University, Houston, Texas
Simpson TW, Korte JJ, Mauery TM, Mistree F (2001) Kriging models for global approximation in simulation-based multidisciplinary design optimization. AIAA J 39(12):2233–2241
Taddy MA, Gramacy RB, Polson NG (2011) Dynamic trees for learning and design. J Am Stat Assoc 106(493):109–123
Torczon V (1997) On the convergence of pattern search algorithms. SIAM J Optim 7(1):1–25
Tribes C, Dubé J-F, Trépanier J-Y (2005) Decomposition of multidisciplinary optimization problems: formulations and application to a simplified wing design. Eng Optim 37(8):775–796
Vaz AIF, Vicente LN (2007) A particle swarm pattern search method for bound constrained global optimization. J Glob Optim 39(2):197–219
Willcox K, Peraire J (2002) Balanced model reduction via the proper orthogonal decomposition. AIAA J 40(11):2323–2330
Williams BJ, Santner TJ, Notz WI (2000) Sequential design of computer experiments to minimize integrated response functions. Stat Sin 10(4):1133–1152
Acknowledgments
This work was partially supported by NSERC Discovery Grants 418250-2012 and 436193-2013 and by a GERAD postdoctoral fellowship; such support does not constitute an endorsement by the sponsors of the opinions expressed in this chapter. The authors would like to thank Prof. Charles Audet of GERAD and École Polytechnique for his useful comments and Prof. Robert Gramacy of the University of Chicago for his help with implementing dynaTree within NOMAD.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this chapter
Talgorn, B., Le Digabel, S., Kokkolaras, M. (2015). Blackbox Optimization in Engineering Design: Adaptive Statistical Surrogates and Direct Search Algorithms. In: Lagaros, N., Papadrakakis, M. (eds) Engineering and Applied Sciences Optimization. Computational Methods in Applied Sciences, vol 38. Springer, Cham. https://doi.org/10.1007/978-3-319-18320-6_19
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-18319-0
Online ISBN: 978-3-319-18320-6