
Empirical Software Engineering, Volume 19, Issue 1, pp 39–68

Contributions of model checking and CoFI methodology to the development of space embedded software

  • Rodrigo Pastl Pontes
  • Paulo Claudino Véras
  • Ana Maria Ambrosio
  • Emília Villani

Abstract

The role of embedded software in recent space accidents highlights the importance of verification and validation techniques for the development of space embedded software. In this context, this work analyses the contribution of two verification techniques applied to the onboard data handling software of space products. The first technique is model checking. The system is modeled as a set of timed automata, and safety and liveness properties are verified using the UPPAAL model checker. The verified model is then used to generate the embedded software. The second technique is a model-based approach to test case generation. The Conformance and Fault Injection (CoFI) testing methodology guides the development of a set of Finite State Machine (FSM) models from the software specification, and the test suite is generated automatically from these models. The contributions of the two methodologies are analyzed based on the results of an experiment. Two software products are used as case studies, each implementing two services of the Packet Utilization Standard (PUS). These services represent functionalities offered by a satellite onboard data handling computer. One product is developed with the aid of model checking, while the other is developed according to the practices currently used at the Instituto Nacional de Pesquisas Espaciais (INPE). Both software products are tested using the CoFI methodology. The experiment highlights the advantages and weak points of model checking. It also demonstrates that the main contribution of the CoFI testing methodology is to expose problems related to situations not considered in the software specification, such as the occurrence of inopportune events. This analysis helps to understand how different techniques can be integrated in the design of critical embedded software.
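The abstract describes deriving a test suite automatically from FSM models of the specification. As a rough illustration of that idea only (this is not the actual CoFI tooling or ConData generator, and the PUS-like states, inputs, and outputs below are hypothetical), the following sketch produces one test case per FSM transition, each prefixed with the shortest input sequence that reaches the transition's source state:

```python
# Minimal sketch of FSM-based test generation in the spirit of a
# transition-coverage criterion. The FSM (states, inputs, outputs)
# is a hypothetical stand-in for a PUS service model.

from collections import deque

# transitions: (state, input) -> (next_state, expected_output)
FSM = {
    ("IDLE", "tc_enable"): ("ENABLED", "ack"),
    ("ENABLED", "tc_report"): ("ENABLED", "housekeeping_report"),
    ("ENABLED", "tc_disable"): ("IDLE", "ack"),
}

def shortest_preamble(start, target_state):
    """BFS over the FSM to find an input sequence reaching target_state."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == target_state:
            return path
        for (src, inp), (nxt, _) in FSM.items():
            if src == state and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [inp]))
    return None  # target unreachable from start

def transition_coverage_suite(start="IDLE"):
    """One test case per transition: preamble + the transition's input."""
    suite = []
    for (state, inp), (_, out) in FSM.items():
        preamble = shortest_preamble(start, state)
        suite.append({"inputs": preamble + [inp], "expected": out})
    return suite

for case in transition_coverage_suite():
    print(case)
```

Note that CoFI goes further than this sketch: besides normal behavior, it builds separate partial models for exceptional situations and inopportune events, which is where the experiment reported here found its main contribution.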

Keywords

Verification · Model checking · Model-based testing · Embedded software · Packet Utilization Standard (PUS) · Space application

Acknowledgements

The authors would like to thank Fabrício Kucinskis, Ronaldo Arias and Thiago Pereira of the Aerospace Electronics Department (DEA) of INPE for their collaboration and support. The authors also thank the project Sistemas Inerciais para Aplicações Aeroespaciais (SIA), FINEP/CTA/INPE, and CNPq (grant 306259/2011-7) for financial support.


Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  • Rodrigo Pastl Pontes (1)
  • Paulo Claudino Véras (1)
  • Ana Maria Ambrosio (2)
  • Emília Villani (1)
  1. Instituto Tecnológico de Aeronáutica (ITA), São José dos Campos, Brazil
  2. Instituto Nacional de Pesquisas Espaciais (INPE), São José dos Campos, Brazil
