Software & Systems Modeling

Volume 18, Issue 4, pp 2507–2530

Using language workbenches and domain-specific languages for safety-critical software development

  • Markus Voelter (corresponding author)
  • Bernd Kolb
  • Klaus Birken
  • Federico Tomassetti
  • Patrick Alff
  • Laurent Wiart
  • Andreas Wortmann
  • Arne Nordmann
Regular Paper


Abstract

Language workbenches support the efficient creation, integration, and use of domain-specific languages. Typically, they execute models by generating code in a general-purpose programming language. This can lead to increased productivity and higher quality. However, in safety- or mission-critical environments, generated code may not be considered trustworthy because of a lack of trust in the generation mechanisms. This makes it harder to justify the use of language workbenches in such environments. In this paper, we demonstrate an approach to using such tools in critical environments. We argue that models created with domain-specific languages are easier to validate, and that the additional risk resulting from the transformation to code can be mitigated by a suitably designed transformation and verification architecture. We validate the approach with an industrial case study from the healthcare domain. We also discuss the degree to which the approach is appropriate for critical software in space, automotive, and robotics systems.


Keywords: Domain-specific languages · Safety-critical software development · Case study · Language workbenches



Acknowledgements

The authors would like to thank the teams at Voluntis and itemis who built the system that underlies the case study, including Wladimir Safonov, Jürgen Haug, Sergej Koščejev, Alexis Archambault, and Nikhil Khandelwal. We would also like to thank Richard Paige and Sebastian Zarnekow for their feedback on drafts of the paper.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  • Markus Voelter (1, corresponding author)
  • Bernd Kolb (2)
  • Klaus Birken (2)
  • Federico Tomassetti (3)
  • Patrick Alff (4)
  • Laurent Wiart (4)
  • Andreas Wortmann (5)
  • Arne Nordmann (6)
  1. independent/itemis AG, Stuttgart, Germany
  2. itemis AG, Stuttgart, Germany
  3. Turin, Italy
  4. Voluntis, Paris, France
  5. OHB System AG, Bremen, Germany
  6. Bosch Corporate Research, Stuttgart, Germany
