
Blackbox Optimization in Engineering Design: Adaptive Statistical Surrogates and Direct Search Algorithms

Chapter in: Engineering and Applied Sciences Optimization

Part of the book series: Computational Methods in Applied Sciences, volume 38

Abstract

Simulation-based design optimization relies on computational models to evaluate objective and constraint functions. Typical challenges of solving simulation-based design optimization problems include unavailable gradients or unreliable approximations thereof, excessive computational cost, numerical noise, multi-modality and even the models’ failure to return a value. It has become common to use the term “blackbox” for a computational model that features any of these characteristics and/or is inaccessible by the design engineer (i.e., cannot be modified directly to address these issues). A possible remedy for dealing with blackboxes is to use surrogate-based derivative-free optimization methods. However, this has to be done carefully using appropriate formulations and algorithms. In this work, we use the R dynaTree package to build statistical surrogates of the blackboxes and the direct search method for derivative-free optimization. We present different formulations for the surrogate problem considered at each search step of the Mesh Adaptive Direct Search (MADS) algorithm using a surrogate management framework. The proposed formulations are tested on two simulation-based multidisciplinary design optimization problems. Numerical results confirm that the use of statistical surrogates in MADS improves the efficiency of the optimization algorithm.

We are honored to have this work appear in a book dedicated to the memory of Professor M.G. Karlaftis.

This material was presented by the third author in an invited semi-plenary lecture at OPT-i 2014.
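The surrogate management idea summarized in the abstract — a cheap model fitted to the cache of past blackbox evaluations proposes trial points in the MADS search step, while the mesh-based poll step preserves convergence guarantees — can be sketched as follows. This is a simplified illustration, not the authors' implementation: the inverse-distance surrogate below is a stand-in for the dynaTree statistical surrogate, the coordinate poll is a GPS-style simplification of the richer MADS poll directions, and all function and parameter names (`surrogate`, `mads_iteration`, `n_search`) are hypothetical.

```python
import random

def surrogate(cache, x):
    """Inverse-distance-weighted prediction from cached (point, value) pairs.

    A simple stand-in for a statistical surrogate such as dynaTree: cheap to
    evaluate, fitted only to already-computed blackbox values.
    """
    num = den = 0.0
    for xc, fc in cache:
        d2 = sum((a - b) ** 2 for a, b in zip(x, xc))
        if d2 == 0.0:
            return fc  # exact hit on a cached point
        w = 1.0 / d2
        num += w * fc
        den += w
    return num / den

def mads_iteration(f, cache, incumbent, mesh_size, n_search=50, seed=0):
    """One search-then-poll iteration; returns (new incumbent, new mesh size).

    Search step: minimize the surrogate over random mesh candidates and spend
    a single true (expensive) evaluation on the surrogate minimizer.
    Poll step: if the search fails, evaluate f on coordinate mesh directions.
    """
    rng = random.Random(seed)
    x_best, f_best = incumbent
    n = len(x_best)
    # Search: rank cheap surrogate predictions, evaluate f only at the best.
    candidates = [tuple(xi + mesh_size * rng.randint(-5, 5) for xi in x_best)
                  for _ in range(n_search)]
    x_try = min(candidates, key=lambda x: surrogate(cache, x))
    f_try = f(x_try)
    cache.append((x_try, f_try))
    if f_try < f_best:                    # search success: expand the mesh
        return (x_try, f_try), 2.0 * mesh_size
    # Poll: positive spanning set of +/- coordinate directions on the mesh.
    for i in range(n):
        for s in (+1.0, -1.0):
            xp = tuple(xi + (s * mesh_size if j == i else 0.0)
                       for j, xi in enumerate(x_best))
            fp = f(xp)
            cache.append((xp, fp))
            if fp < f_best:
                return (xp, fp), 2.0 * mesh_size
    return (x_best, f_best), 0.5 * mesh_size  # failure: refine the mesh
```

On a smooth test function such as f(x) = ||x||^2, a handful of iterations drives the incumbent value down while spending few true evaluations per iteration. In the chapter, the surrogate problem at each search step is itself solved by an optimization run rather than by the simple random candidate sampling used here.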


Notes

  1. There may be situations where the properties of the objective function or some of the constraints do not require the construction and use of surrogate models, e.g., if one of these functions is smooth and inexpensive and has an analytical expression.


Acknowledgments

This work was partially supported by NSERC Discovery Grants 418250-2012 and 436193-2013 and by a GERAD postdoctoral fellowship; such support does not constitute an endorsement by the sponsors of the opinions expressed in this chapter. The authors would like to thank Prof. Charles Audet of GERAD and École Polytechnique for his useful comments and Prof. Robert Gramacy of the University of Chicago for his help with implementing dynaTree within NOMAD.

Corresponding author

Correspondence to Michael Kokkolaras.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Talgorn, B., Le Digabel, S., Kokkolaras, M. (2015). Blackbox Optimization in Engineering Design: Adaptive Statistical Surrogates and Direct Search Algorithms. In: Lagaros, N., Papadrakakis, M. (eds) Engineering and Applied Sciences Optimization. Computational Methods in Applied Sciences, vol 38. Springer, Cham. https://doi.org/10.1007/978-3-319-18320-6_19


  • DOI: https://doi.org/10.1007/978-3-319-18320-6_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-18319-0

  • Online ISBN: 978-3-319-18320-6

