
Improving Software Engineering Research Through Experimentation Workbenches

From Software Engineering to Formal Methods and Tools, and Back

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11865)

Abstract

Experimentation with software prototypes plays a fundamental role in software engineering research. In contrast to many other scientific disciplines, however, explicit support for this key activity in software engineering is comparatively limited. While the software engineering community has proposed some approaches to improve this situation, experiments are still very difficult and sometimes impossible to replicate.

In this paper, we propose the concept of an experimentation workbench as a means of explicit support for experimentation in software engineering research. In particular, we discuss core requirements that an experimentation workbench should satisfy in order to qualify as such and to offer real benefit to researchers. Beyond their core benefits for experimentation, we stipulate that experimentation workbenches will also benefit the reproducibility and repeatability of software engineering research. Further, we illustrate this concept with a scenario and a case study, and describe relevant challenges as well as our experience with experimentation workbenches.


Notes

  1. Available at GitHub: https://github.com/KernelHaven/KernelHaven.

  2. Available at GitHub: https://github.com/KernelHaven/MetricHaven.

  3. In the case of minor variations, of course, variations of existing plugins or even parameterized instances can also be used. To support this, a parametrization approach for plugins exists.

  4. http://www.revamp2-project.eu/.

  5. https://sse.uni-hildesheim.de/en/research/projects/easy-producer/.
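The parametrization approach mentioned in note 3 can be illustrated with a minimal sketch. This is a hypothetical example, not KernelHaven's actual plugin API: the names `AnalysisPlugin`, `ThresholdMetricPlugin`, and the configuration-map constructor are illustrative assumptions showing how one plugin implementation can serve several experiment variants through configuration parameters rather than code changes.

```java
import java.util.Map;

// Hypothetical plugin interface: a workbench plugin transforms an input
// artifact into an analysis result.
interface AnalysisPlugin {
    String run(String input);
}

// A parameterized plugin: minor experiment variations are expressed as
// configuration values instead of separate plugin implementations.
class ThresholdMetricPlugin implements AnalysisPlugin {
    private final int threshold;

    // Parameters come from the experiment configuration at instantiation time.
    ThresholdMetricPlugin(Map<String, String> config) {
        this.threshold = Integer.parseInt(config.getOrDefault("threshold", "10"));
    }

    @Override
    public String run(String input) {
        // Trivial stand-in analysis: classify by input length.
        return input.length() > threshold ? "complex" : "simple";
    }
}

public class PluginDemo {
    public static void main(String[] args) {
        // Two experiment variants reuse the same plugin with different parameters.
        AnalysisPlugin variantA = new ThresholdMetricPlugin(Map.of("threshold", "5"));
        AnalysisPlugin variantB = new ThresholdMetricPlugin(Map.of("threshold", "20"));
        System.out.println(variantA.run("some_function_body")); // classified as complex
        System.out.println(variantB.run("some_function_body")); // classified as simple
    }
}
```

The design point is that the variant is captured in the recorded experiment configuration, which supports exact replication of a run.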


Acknowledgements

This work is partially supported by the ITEA3 project REVaMP², funded by the BMBF (German Ministry of Research and Education) under grant 01IS16042H. Any opinions expressed herein are solely by the authors and not by the BMBF.

Author information


Correspondence to Klaus Schmid, Sascha El-Sharkawy or Christian Kröher.


Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Schmid, K., El-Sharkawy, S., Kröher, C. (2019). Improving Software Engineering Research Through Experimentation Workbenches. In: ter Beek, M., Fantechi, A., Semini, L. (eds) From Software Engineering to Formal Methods and Tools, and Back. Lecture Notes in Computer Science, vol 11865. Springer, Cham. https://doi.org/10.1007/978-3-030-30985-5_6


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30984-8

  • Online ISBN: 978-3-030-30985-5
