
Experimentation with dynamic simulation models in software engineering: planning and reporting guidelines

Published in: Empirical Software Engineering

Abstract

Simulation-based studies (SBS) have become an attractive investigation approach for Software Engineering (SE). However, reports on experiments with dynamic simulation models found in the technical literature often lack relevant information, hampering a full understanding of the reported procedures and results, as well as their replicability. Apart from the length limitations of conference and journal papers, some of the relevant information appears to be missing due to methodological issues not considered when conducting such studies, such as missing research questions and goals, lack of evidence regarding dynamic simulation model validity, and poorly designed simulation experiments, among others. Based on findings from a previous quasi-systematic literature review, we propose a set of reporting guidelines for SBS with dynamic models in the context of SE, aiming at providing guidance on which information such reports should contain. Furthermore, through qualitative analysis and external evaluation, these guidelines were evolved to support SBS planning by identifying potential threats to simulation study validity and making recommendations to avoid them. Finally, we conducted different evaluations of both the reporting and planning guidelines, besides using them to support the planning of an SBS regarding software evolution. A set of 33 reporting and planning guidelines, covering different stages of the simulation lifecycle and focused on experimentation with dynamic simulation models, has been put together. The first assessments point to a comprehensive set of guidelines, supporting the preparation and review of study plans and reports, as well as the planning of an SBS focused on software evolution, potentially reducing threats to the validity of experimentation with dynamic simulation models. The 33 guidelines cannot be understood as separate groups for reporting and planning, as they overlap in many aspects. The main goal is to use the guidelines to support the planning of a simulation-based study with dynamic models, so that experimenters may identify potential threats to validity and produce in advance the relevant information for a complete simulation experiment report. Despite their initial contribution to increasing the validity of SBS, the reporting and planning of simulation-based experiments with dynamic models still need to be discussed and improved in SE. Therefore, additional assessments of this set of guidelines are needed to strengthen confidence in their completeness and usefulness.
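To make concrete what "experimentation with a dynamic simulation model" involves, the sketch below pairs a toy system dynamics model of software evolution with a minimal designed experiment: one factor, three levels, and replicated stochastic runs, the kind of design information the guidelines ask experimenters to plan and report explicitly. This is an illustrative assumption, not a model from the paper: all equations, parameter names, and values are invented for the example.

```python
# Minimal sketch (not from the paper): a toy system-dynamics model of
# software evolution, plus a tiny designed simulation experiment.
# All equations, parameters, and values are illustrative assumptions.

import random
import statistics

def simulate(refactor_rate, horizon=100, dt=1.0, seed=None):
    """Euler-integrate two stocks: delivered functionality and
    accumulated complexity. Complexity slows further growth
    (a Lehman-style feedback loop); refactoring drains it."""
    rng = random.Random(seed)
    functionality, complexity = 0.0, 0.0
    for _ in range(int(horizon / dt)):
        # productivity decays as complexity accumulates (assumed form)
        productivity = 10.0 / (1.0 + 0.05 * complexity)
        noise = rng.gauss(1.0, 0.1)          # stochastic effort variation
        growth = productivity * noise
        functionality += growth * dt
        complexity += (0.3 * growth - refactor_rate * complexity) * dt
        complexity = max(complexity, 0.0)
    return functionality

# One factor (refactoring rate), three levels, 30 replications per
# level with distinct seeds -- the factors, levels, and replication
# scheme are exactly what a simulation experiment report should state.
for level in (0.0, 0.05, 0.10):
    runs = [simulate(level, seed=s) for s in range(30)]
    print(f"refactor_rate={level:.2f}: "
          f"mean={statistics.mean(runs):.1f}, "
          f"sd={statistics.stdev(runs):.1f}")
```

Even in this toy setting, the pieces the guidelines target are visible: a stated output variable (delivered functionality), controlled factors and levels, seeded replications, and summary statistics, each of which must appear in the plan and the report for the study to be understandable and replicable.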



Notes

  1. http://lens-ese.cos.ufrj.br/wikiese/index.php/Experimental_Software_Engineering_-_Glossary_of_Terms

  2. http://www.personal.psu.edu/cvm115/proposal/formulating_problem_statements.htm

  3. http://www.icsp-conferences.org/

  4. http://onlinelibrary.wiley.com/journal/10.1002/(ISSN)1099-1670


Acknowledgments

The authors would like to thank the CNPq (Grants 141152/2010-9 and 304795/2010-0) for their support of this work.

Author information


Corresponding author

Correspondence to Breno Bernard Nicolau de França.

Additional information

Communicated by: Maurizio Morisio


About this article


Cite this article

de França, B.B.N., Travassos, G.H. Experimentation with dynamic simulation models in software engineering: planning and reporting guidelines. Empir Software Eng 21, 1302–1345 (2016). https://doi.org/10.1007/s10664-015-9386-4

