
Product and Process Innovation by Integrating Physical and Simulation Experiments

  • Daniele Romano

Abstract

Technical innovation in industry can benefit substantially from an investigation strategy that properly combines experiments in the field with experiments on a simulation model of the product or the process. However, a methodological framework for the effective integration of the two kinds of investigation is still missing. On the one hand, simulation and laboratory tests are routinely used together in the R&D activities of high-tech companies, although generally not in the form of statistically designed experiments. On the other hand, design of experiments and computer experiments are sound methodologies for running experiments in physical and numerical settings, respectively, but they have so far largely disregarded the integration issue. This chapter outlines a broad approach to running a sequence of physical and simulation experiments from the viewpoint of incremental system innovation. Although the approach is still qualitative, it introduces all of the elements (system innovation, model calibration, model validation and modification, building of mechanistic models) needed to tackle this new and industrially relevant problem. The approach is demonstrated through its application to the design of an engineering system and the improvement of a production process.
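As a concrete illustration of the calibration element mentioned above, consider a minimal sketch (not taken from the chapter's case studies): a hypothetical computer model with a tuning parameter theta is fitted by least squares to a few physical measurements, and the residuals left after tuning indicate whether the model may also need structural modification. The simulator form, the data, and all names here are illustrative assumptions.

```python
# Minimal sketch of simulator calibration against physical data.
# The simulator, data, and parameter bounds are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

def simulator(x, theta):
    """Toy computer model: response at design point x for tuning parameter theta."""
    return np.exp(-theta * x)

# Hypothetical noisy physical measurements at a few design points.
x_phys = np.array([0.2, 0.5, 1.0, 1.5])
y_phys = np.array([0.82, 0.61, 0.38, 0.22])

def calibration_loss(theta):
    """Sum of squared discrepancies between field data and simulator output."""
    return np.sum((y_phys - simulator(x_phys, theta)) ** 2)

# Tune theta so the computer model best reproduces the physical experiment.
result = minimize_scalar(calibration_loss, bounds=(0.1, 5.0), method="bounded")
theta_hat = result.x
print(f"calibrated theta: {theta_hat:.3f}")

# Validation check: a systematic pattern in the residuals would suggest
# that the model needs modification, not just re-tuning.
residuals = y_phys - simulator(x_phys, theta_hat)
print("residuals:", np.round(residuals, 3))
```

In the sequential spirit of the chapter, a systematic residual bias after calibration would trigger a model modification, after which new physical runs would be designed to validate the revised model.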

Keywords

Simulation Experiment · Physical Experiment · Pilot Plant · Computer Experiment · Robust Design

Copyright information

© Springer 2009

Authors and Affiliations

  • Daniele Romano
  1. Department of Mechanical Engineering, University of Cagliari, Cagliari, Italy
