Action Research vs. Design Research

  • Miroslaw Staron


Action research is one of many research methodologies used in contemporary empirical software engineering. Its main appeal lies in its practical orientation and its embedding in the context of a company. However, this embedding can be challenging, as it requires active participation from industrial partners. Therefore, we sometimes change the course of our studies and turn to a closely related methodology: design science research. In this chapter, we explore the basic principles of design science research and compare the two methodologies.



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Miroslaw Staron
  1. Department of Computer Science and Engineering, University of Gothenburg, Gothenburg, Sweden