
An Experimental Evaluation of the Understanding of Safety Compliance Needs with Models

  • Jose Luis de la Vara
  • Beatriz Marín
  • Clara Ayora
  • Giovanni Giachetti
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10650)

Abstract

Context: Most safety-critical systems have to fulfil compliance needs specified in safety standards. These needs can be difficult to understand from the text of the standards, and the use of conceptual models has been proposed as a solution. Goal: We aim to evaluate the understanding of safety compliance needs with models. Method: We conducted an experiment to study the effectiveness, efficiency, and perceived benefits of understanding these needs with the text of safety standards and with UML object diagrams. Results: Sixteen Bachelor students participated in the experiment. Their average effectiveness in understanding compliance needs and their average efficiency were higher with models (by 17% and 15%, respectively), but the differences are not statistically significant. The students found benefits in using models, although on average they were undecided about the models' ease of understanding. Conclusions: Although the results are not conclusive, they suggest that the use of models could improve the understanding of safety compliance needs.
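To make the comparison concrete, the following is a minimal, hypothetical sketch of how a textual compliance requirement might be captured as linked objects, written in Python purely for illustration. The class names (ComplianceNeed, Activity, Artefact), attributes, and the clause reference are assumptions for this example, not the metamodel or diagrams used in the paper.

    # Illustrative sketch only: representing a safety compliance need as objects.
    # Class and attribute names are assumptions, not the paper's metamodel.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Artefact:
        name: str                      # work product the standard requires

    @dataclass
    class Activity:
        name: str                      # activity the standard prescribes
        produces: List[Artefact] = field(default_factory=list)

    @dataclass
    class ComplianceNeed:
        standard: str                  # safety standard imposing the need
        clause: str                    # clause reference (illustrative here)
        activities: List[Activity] = field(default_factory=list)

    # Example: a textual requirement such as "a hazard analysis shall be
    # performed and a hazard log shall be produced" rendered as objects.
    hazard_log = Artefact(name="Hazard Log")
    hazard_analysis = Activity(name="Hazard Analysis", produces=[hazard_log])
    need = ComplianceNeed(standard="IEC 61508", clause="7.4 (illustrative)",
                          activities=[hazard_analysis])

    if __name__ == "__main__":
        for activity in need.activities:
            outputs = ", ".join(a.name for a in activity.produces)
            print(f"{need.standard} {need.clause}: perform '{activity.name}', "
                  f"produce {outputs}")

In the experiment, an equivalent structure was shown to subjects either as the standard's text or as a UML object diagram; the sketch above only illustrates the kind of information being compared.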

Keywords

Safety-critical system · Safety standard · Safety compliance needs · Model · Understanding · Comprehension · Experiment

Acknowledgments

The research leading to this paper has received funding from the AMASS project (H2020-ECSEL grant agreement no 692474; Spain's MINECO ref. PCIN-2015-262) and the AMoDDI project (Ref. 11130583). We also thank the subjects who participated in the experiment.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Jose Luis de la Vara (1)
  • Beatriz Marín (2)
  • Clara Ayora (3)
  • Giovanni Giachetti (4)
  1. Departamento de Informática, Universidad Carlos III de Madrid, Leganés, Spain
  2. Facultad de Ingeniería, Universidad Diego Portales, Santiago, Chile
  3. R&D Department, Treelogic, Madrid, Spain
  4. Universidad Tecnológica de Chile INACAP, Santiago, Chile
