We Need the Open Artefact: Design Science as a Pathway to Open Science in Information Systems Research

  • Cathal Doyle
  • Markus Luczak-Roesch
  • Abhinav Mittal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11491)

Abstract

Design science research (DSR) faces significant challenges: how to make the knowledge and artefacts we create more accessible; exclusion from competitive funding schemes that require open practices; and a potential reproducibility crisis if scholars do not have access to everything needed to repeat past research. To help tackle these challenges, we suggest that the community should engage strongly with open science, which has grown in prominence in other fields in recent years. A review of the current DSR literature suggests that researchers have not yet discussed how open science practices can be adopted within the field. We therefore propose how the concepts of open science, namely open access, open data, open source, and open peer review, can be mapped to a DSR process model. Further, we identify an emerging concept, the open artefact, which provides an opportunity to make artefacts more accessible to practitioners and scholars. The aim of this paper is to stimulate a discussion amongst researchers about these open science practices in DSR, and about whether they are a necessary step forward to keep pace with the changing academic environment.

Keywords

Design science research · Open science · Open access · Open data · Open source · Open peer review · Open artefact · DSR process model


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Victoria University of Wellington, Wellington, New Zealand
