
Comparing Different Visualizations for Feedback on Test Execution in a Model-Driven Engineering Environment

  • Conference paper
Enterprise, Business-Process and Information Systems Modeling (BPMDS 2023, EMMSAD 2023)

Abstract

In Model-Driven Engineering (MDE), source code can be automatically generated from models such as class diagrams and statecharts. However, even under the assumption that a model is correctly translated into executable code, there is no guarantee that the model correctly captures the user requirements. The validity of a model can be assessed by means of model execution or by testing the (prototype) application generated from the model. The completeness of such a validation effort can be expressed in terms of the model coverage achieved by the executed scenarios. TesCaV is a Model-Based Testing (MBT) tool that gives users feedback by visualizing which test cases have been executed and which have not. This makes TesCaV suitable for an educational setting, as its feedback about manually executed test cases can be leveraged to help students understand how to adequately test a software system. However, it remains unclear how this feedback is best presented so as to give the user maximal information at minimal cognitive load. This research evaluates several visualizations designed according to information visualization principles and ranks them based on a questionnaire distributed to 45 participants.
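To make the notion of "model coverage of the executed scenarios" concrete, the sketch below computes transition coverage of a statechart from a set of manually executed scenarios and reports which transitions remain untested, which is the kind of feedback TesCaV visualizes. This is a minimal illustration under invented assumptions, not TesCaV's actual implementation: the Transition dataclass and the "Order" statechart are made up for the example, and TesCaV covers more criteria than transitions alone.

```python
# Minimal sketch of statechart transition coverage (illustrative only;
# not TesCaV's actual implementation).
from dataclasses import dataclass

@dataclass(frozen=True)
class Transition:
    source: str  # source state
    event: str   # triggering event
    target: str  # target state

# Hypothetical statechart for an "Order" class: every transition the model defines.
model_transitions = {
    Transition("Created", "pay", "Paid"),
    Transition("Created", "cancel", "Cancelled"),
    Transition("Paid", "ship", "Shipped"),
}

# Transitions exercised so far by the user's manual test scenarios.
executed = [
    Transition("Created", "pay", "Paid"),
    Transition("Paid", "ship", "Shipped"),
]

covered = set(executed) & model_transitions
uncovered = model_transitions - covered
coverage = len(covered) / len(model_transitions)

print(f"Transition coverage: {coverage:.0%}")  # -> 67%
for t in sorted(uncovered, key=lambda t: (t.source, t.event)):
    print(f"Not yet tested: {t.source} --{t.event}--> {t.target}")
```

A tool would render such per-criterion results graphically (for instance, colouring covered and uncovered transitions in the model) rather than printing them, which is exactly the design space the paper's visualizations explore.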


Notes

  1. The approval for the distribution of this questionnaire can be found under SMEC number G-2023-6238-R2(MIN).


Acknowledgment

This paper has been funded by the ENACTEST Erasmus+ project number 101055874.

Author information


Corresponding author

Correspondence to Felix Cammaerts.


Appendix

Fig. 5. The preferred visualization for each criterion.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Cammaerts, F., Snoeck, M. (2023). Comparing Different Visualizations for Feedback on Test Execution in a Model-Driven Engineering Environment. In: van der Aa, H., Bork, D., Proper, H.A., Schmidt, R. (eds) Enterprise, Business-Process and Information Systems Modeling. BPMDS 2023, EMMSAD 2023. Lecture Notes in Business Information Processing, vol 479. Springer, Cham. https://doi.org/10.1007/978-3-031-34241-7_22


  • DOI: https://doi.org/10.1007/978-3-031-34241-7_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-34240-0

  • Online ISBN: 978-3-031-34241-7

  • eBook Packages: Computer Science, Computer Science (R0)
