Design and Analysis of Simulation Experiments

  • Jack P. C. Kleijnen
Conference paper
Part of the Springer Proceedings in Mathematics & Statistics book series (PROMS, volume 231)


This contribution summarizes the design and analysis of experiments with computerized simulation models. It focuses on two types of metamodel (also called surrogate or emulator): first-order or second-order polynomial regression, and Kriging (Gaussian process) models. The metamodel type determines the design of the simulation experiment, which in turn specifies the input combinations at which the simulation model is run. Before applying these metamodels, analysts should screen the many inputs of a realistic simulation model; this contribution focuses on sequential bifurcation. Optimization of the simulated system may use either a sequence of first-order and second-order polynomials, known as response surface methodology (RSM), or Kriging models fitted through sequential designs, including efficient global optimization (EGO). Robust optimization accounts for uncertainty in some simulation inputs.
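The link between metamodel type and experimental design can be illustrated with a minimal sketch. The toy `simulate` function below is a hypothetical stand-in, not a model from the paper: a 2^2 full factorial design with coded inputs -1/+1 is orthogonal, so the least-squares estimates of a first-order polynomial metamodel reduce to simple averages of contrasts.

```python
def simulate(x1, x2):
    """Hypothetical deterministic simulation model (illustrative stand-in)."""
    return 10.0 + 3.0 * x1 - 2.0 * x2 + 0.5 * x1 * x2

# 2^2 factorial design: all combinations of the two coded levels -1 and +1.
design = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]
y = [simulate(x1, x2) for x1, x2 in design]
n = len(design)

# Because the design matrix is orthogonal, ordinary least squares for the
# metamodel y ~ beta0 + beta1*x1 + beta2*x2 collapses to contrasts:
# beta_j = (1/n) * sum_i x_ij * y_i, and beta0 is the mean response.
beta0 = sum(y) / n
beta1 = sum(x1 * yi for (x1, _), yi in zip(design, y)) / n
beta2 = sum(x2 * yi for (_, x2), yi in zip(design, y)) / n

print(f"metamodel: y = {beta0:.1f} + ({beta1:.1f})*x1 + ({beta2:.1f})*x2")
```

Note that the first-order metamodel recovers the main effects (3 and -2) exactly, while the interaction term of the toy simulation is aliased away; estimating it would require the cross-product column x1*x2, and a second-order polynomial or Kriging metamodel would need a richer design with more than two levels per input.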


Keywords: Robustness and sensitivity · Metamodel · Design · Regression · Kriging



Acknowledgements. I thank the editors for inviting me to write a contribution for this book and W. Shi (Hubei University of Economics, Wuhan, China) for commenting on Sect. 1.4.



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Tilburg University, Tilburg, Netherlands
