
The Impact of Testcases on the Maintainability of Declarative Process Models

  • Stefan Zugal
  • Jakob Pinggera
  • Barbara Weber
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 81)

Abstract

Declarative approaches to process modeling are regarded as well suited for highly volatile environments, as they provide a high degree of flexibility. However, difficulties in understanding and maintaining declarative process models impede their adoption. To compensate for these shortcomings, Test Driven Modeling has been proposed. This paper reports on a controlled experiment evaluating the impact of Test Driven Modeling, in particular the adoption of testcases, on process model maintenance. In the experiment, students modified two declarative process models, one with the support of testcases and one without. The data gathered shows that the adoption of testcases significantly lowers cognitive load and increases the perceived quality of changes. In addition, modelers who had testcases at hand performed significantly more change operations, while the quality of the process models did not decrease.
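To make the notion of a testcase for a declarative process model concrete, the following is a minimal sketch, not the paper's actual tooling: a declarative model is represented as a set of constraints over execution traces (here, the well-known "response" and "precedence" constraint templates), and a testcase pairs a sample trace with the outcome the modeler expects. All activity names and function names are illustrative.

```python
def response(a, b):
    """Constraint: every occurrence of a must eventually be followed by b."""
    def check(trace):
        return all(b in trace[i + 1:] for i, t in enumerate(trace) if t == a)
    return check

def precedence(a, b):
    """Constraint: b may only occur after a has occurred at least once."""
    def check(trace):
        seen_a = False
        for t in trace:
            if t == a:
                seen_a = True
            elif t == b and not seen_a:
                return False
        return True
    return check

# A declarative model is simply a set of constraints; any trace that
# violates none of them is a valid execution.
model = [response("register", "archive"), precedence("register", "ship")]

def satisfies(model, trace):
    return all(constraint(trace) for constraint in model)

# Testcases: an expected-valid and an expected-invalid trace. When the
# model is changed, re-running these checks reveals unintended effects.
assert satisfies(model, ["register", "ship", "archive"])
assert not satisfies(model, ["ship", "register", "archive"])
```

Under this reading, maintaining a model with testcases at hand means each change operation can immediately be validated against the traces the modeler cares about, which is the mechanism the experiment evaluates.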

Keywords

Declarative Business Process Models, Test Driven Modeling, Empirical Research



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Stefan Zugal, University of Innsbruck, Austria
  • Jakob Pinggera, University of Innsbruck, Austria
  • Barbara Weber, University of Innsbruck, Austria
