Robust Solutions in Engineering Design: stochastic simulation versus DACE

  • R. A. Bates
  • H. P. Wynn


This paper compares two different methods for robust design improvement. The first method, called stochastic simulation, combines traditional Computer-Aided Engineering (CAE) simulation tools with variation in the simulation model parameters in order to estimate the resulting uncertainty in system behaviour for design improvement. The second method, called DACE (Design and Analysis of Computer Experiments), employs traditional Design of Experiments (DOE) methodology to build statistical models of CAE simulation tools, called emulators because they emulate the behaviour of the simulator. Emulators are much faster to compute than the corresponding simulation model and can therefore be used to search the design space efficiently for robust solutions.
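The paper's CAE examples are not reproduced here, but the two approaches can be contrasted with a minimal sketch (the response function, noise model, and all numbers below are hypothetical, not from the paper): stochastic simulation propagates sampled parameter variation directly through the simulator, while a DACE-style emulator is fitted once on a small designed experiment and then queried cheaply in its place.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(x, noise):
    """Stand-in for an expensive CAE simulation (hypothetical response)."""
    return (x - 2.0) ** 2 + noise * x

def stochastic_simulation(x, n_samples=1000, noise_sd=0.1):
    """Propagate parameter variation through the simulator directly,
    returning the estimated mean and spread of the response."""
    noise = rng.normal(0.0, noise_sd, n_samples)
    y = simulator(x, noise)
    return y.mean(), y.std()

# DACE-style emulation: run the simulator on a small designed experiment,
# fit a cheap surrogate, and evaluate the surrogate instead of the simulator.
design = np.linspace(0.0, 4.0, 9)                       # small DOE over the design space
train_y = simulator(design, rng.normal(0.0, 0.1, design.size))
coeffs = np.polyfit(design, train_y, deg=2)             # simple polynomial emulator

def emulator(x):
    return np.polyval(coeffs, x)

mean, sd = stochastic_simulation(1.5)
print(mean, sd)          # direct uncertainty estimate at a design point
print(emulator(1.5))     # near-instant surrogate prediction at the same point
```

The trade-off the paper characterizes is visible even here: the stochastic route costs one full simulator run per sample, while the emulator's cost is concentrated in the initial designed experiment.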

The two methods can therefore be characterized by their computational cost, flexibility and accuracy. Two example problems are used to illustrate the methods and their relative advantages. Measures of variation in the responses are carried forward into multi-objective optimization, so that robustness is treated naturally as a design objective.
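Treating a measure of variation as an objective in its own right can be sketched as follows (a hypothetical one-variable response, not one of the paper's examples): each candidate design is scored by the mean and the standard deviation of its response under noise, and the non-dominated candidates form the performance/robustness trade-off front.

```python
import numpy as np

rng = np.random.default_rng(1)

def response(x, noise):
    """Hypothetical response with a performance/sensitivity trade-off."""
    return np.sin(3.0 * x) + x ** 2 / 8.0 + noise * (2.0 - x)

# Score each candidate design by (mean response, response spread),
# both to be minimized.
candidates = np.linspace(0.0, 3.0, 61)
objectives = []
for x in candidates:
    y = response(x, rng.normal(0.0, 0.1, 500))
    objectives.append((y.mean(), y.std()))

def pareto_front(points):
    """Keep points not strictly dominated in the two objectives."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

front = pareto_front(objectives)
print(len(front), "non-dominated designs out of", len(objectives))
```

Because the spread enters as an explicit objective rather than a constraint, the designer sees the full trade-off between nominal performance and robustness instead of a single compromise point.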


Keywords: Root Mean Square Error, Design Factor, Noise Factor, Robust Solution, Design Improvement





Copyright information

© Springer-Verlag London 2004

Authors and Affiliations

  • R. A. Bates ¹
  • H. P. Wynn
  1. Decision Support and Risk Group, London School of Economics and Political Science, London, UK
