
Is 100% Test Coverage a Reasonable Requirement? Lessons Learned from a Space Software Project

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNPSE, volume 10611)

Abstract

To ensure the dependability and safety of spaceflight devices, rigorous standards are defined. Among others, one requirement from the European Cooperation for Space Standardization (ECSS) standards is 100% test coverage at the software unit level. Stakeholders need a good understanding of the implications of such a requirement to avoid the risks it might entail for the project. In this paper, we study whether such a 100% test coverage requirement is a reasonable one. For this, we interviewed the industrial developers who ran a project whose sole goal was achieving 100% unit test coverage in a spaceflight software. We discuss costs, benefits, risks, effects on quality, interplay with surrounding conditions, and project management implications. We distill lessons learned with which we hope to support other developers and decision makers when considering a 100% unit test coverage requirement.


Notes

  1.

    A Sol is a day on Mars, lasting 24 h 37 min, whereas a sidereal day on Earth lasts 23 h 56 min. The time unit Sol is used to run Mars operations and avoids the need to continuously convert times.
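    The conversion behind this note can be sketched in a few lines. A minimal, hypothetical helper, assuming the 24 h 37 min figure given above (function and constant names are illustrative, not from the paper):

    ```python
    # One Sol expressed in seconds, using the note's figure of 24 h 37 min.
    SOL_SECONDS = (24 * 60 + 37) * 60  # 88,620 s per Sol

    def earth_seconds_to_sols(seconds: float) -> float:
        """Express an elapsed Earth-measured duration in Sols."""
        return seconds / SOL_SECONDS

    # Ten Earth days of elapsed time amount to slightly fewer than ten Sols,
    # because a Sol is longer than an Earth day.
    ten_days_in_sols = earth_seconds_to_sols(10 * 24 * 3600)
    ```

    Working in Sols throughout spares operators exactly this kind of repeated conversion.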

  2.

    Estimates suggest more than 500,000 pieces of junk, so-called ‘space debris’, orbiting Earth at speeds of several km/s [17]. Due to these extreme speeds, even small particles of only a few millimeters carry enough kinetic energy to cause impact craters of several dozen centimeters on a spacecraft and lead to fatal, catastrophic effects such as disintegration of the target.

  3.

    The ECSS standards define four criticality levels, from A to D (ECSS-Q-ST-30C [7]). For instance, criticality class A comprises catastrophic events, e.g., loss of life, launch site facilities, or the entire spacecraft. Class B covers the risk of losing the ability to perform the mission (loss of mission), and class C a major mission degradation. The LCT system, which is the subject of this paper (see Sect. 2), is classified as B (system), and its software as C.

  4.

    The product assurance process performed so far involves several parties and procedures. The device manufacturer’s product assurance reports to and is supervised by the customer’s product assurance (cf. [19]). Further involved at satellite level are the customer’s and the prime contractor’s product assurance. At the technical level, V&V activities include, inter alia, static analyses, verification controls, and reviews. At device level, separate test teams carry out software tests in isolation and as part of the integrated device prior to shipment for full integration and system testing.

  5.

    Due to the sensitivity of the data, we only present excerpts and anonymized results.

  6.

    This is also called the “Pareto principle”, after Joseph M. Juran, who proposed the 80/20 rule: roughly, the first 80% is easy to achieve while the remaining 20% is not.

  7.

    A proportional-integral (PI) controller is a control loop feedback mechanism that continuously computes the difference between an expected and an actual value of a variable (e.g., temperature, electrical current, angle) and applies a correction based on proportional and integral terms.
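    The mechanism described above can be sketched as a minimal discrete-time PI controller. This is an illustrative example, not code from the LCT project; class and parameter names are hypothetical:

    ```python
    class PIController:
        """Minimal discrete-time PI controller sketch."""

        def __init__(self, kp: float, ki: float):
            self.kp = kp          # proportional gain
            self.ki = ki          # integral gain
            self.integral = 0.0   # accumulated error over time

        def update(self, setpoint: float, measurement: float, dt: float) -> float:
            # Error is the difference between the expected and the actual value.
            error = setpoint - measurement
            # The integral term accumulates the error over the time step dt,
            # eliminating steady-state offset that a pure P term would leave.
            self.integral += error * dt
            return self.kp * error + self.ki * self.integral
    ```

    Each call to `update` returns the correction to apply (e.g., heater power or actuator torque); the proportional term reacts to the current error, the integral term to its history.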

References

  1. Adler, M.: Spirit Sol 18 Anomaly, September 2006. http://web.archive.org/web/20110605095126/www.planetary.org/blog/article/00000702

  2. Arthur, L.J.: Quantum improvements in software system quality. Commun. ACM 40(6), 46–52 (1997)


  3. Bennett, T., Wennberg, P.: Eliminating embedded software defects prior to integration test. Qual. Assur. Inst. J. (2006)


  4. Bentley, J.: Programming pearls. Commun. ACM 28(9), 896–901 (1985)


  5. Boehm, B., Basili, V.R.: Software defect reduction top 10 list. Computer 34(1), 135–137 (2001)


  6. ECSS-E-ST-40 Working Group: ECSS-E-ST-40C: Space engineering - Software. Standard, ECSS Secretariat, March 2009


  7. ECSS-Q-ST-30 Working Group: ECSS-Q-ST-30C: Space product assurance - Dependability. Standard, ECSS Secretariat, March 2009


  8. ECSS-Q-ST-80C Working Group: ECSS-Q-ST-80C: Space product assurance - Software product assurance. Standard, ECSS Secretariat, March 2009


  9. ESA: Sentinel online (2017). https://sentinel.esa.int

  10. Fucci, D., Erdogmus, H., Turhan, B., Oivo, M., Juristo, N.: A dissection of test-driven development: does it really matter to test-first or to test-last? IEEE Trans. Softw. Eng. (2017, in press)


  11. Garousi, V., Felderer, M.: Worlds apart: a comparison of industry and academic focus areas in software testing. IEEE Softw. (2017, in press)


  12. Gokhale, S.S., Mullen, R.E.: The marginal value of increased testing: an empirical analysis using four code coverage measures. J. Braz. Comput. Soc. 12(3), 13–30 (2006)


  13. Ingibergsson, J.T.M., Schultz, U.P., Kuhrmann, M.: On the use of safety certification practices in autonomous field robot software development: a systematic mapping study. In: Abrahamsson, P., Corral, L., Oivo, M., Russo, B. (eds.) PROFES 2015. LNCS, vol. 9459, pp. 335–352. Springer, Cham (2015). doi:10.1007/978-3-319-26844-6_25


  14. Marick, B.: How to misuse code coverage. In: Proceedings of the 16th International Conference on Testing Computer Software, pp. 16–18 (1999)


  15. Martin, R.C.: The Clean Coder: A Code of Conduct for Professional Programmers. Pearson Education, Upper Saddle River (2011)


  16. Mockus, A., Nagappan, N., Dinh-Trong, T.T.: Test coverage and post-verification defects: a multiple case study. In: 2009 3rd International Symposium on Empirical Software Engineering and Measurement, pp. 291–301, October 2009


  17. NASA: NASA Missions: Space Station, September 2013. https://www.nasa.gov/mission_pages/station/news/orbital_debris.html

  18. NBC News: German satellite crashed over Asia’s Bay of Bengal, October 2011. http://www.nbcnews.com/id/45032034/ns/technology_and_science-space

  19. Prause, C.R., Bibus, M., Dietrich, C., Jobi, W.: Managing software process evolution for spacecraft from a customer’s perspective. In: Kuhrmann, M., Münch, J., Richardson, I., Rausch, A., Zhang, H. (eds.) Managing Software Process Evolution: Traditional, Agile and Beyond – How to Handle Process Change, pp. 137–163. Springer, Cham (2016). doi:10.1007/978-3-319-31545-4_8


  20. Tolker-Nielsen, T.: EXOMARS 2016 - Schiaparelli anomaly inquiry. Report DG-I/2017/546/TTN, European Space Agency (ESA), May 2017


  21. Vector Software, Inc.: Software Testing Technology Report 2016. Technical report, Vector Software, September 2016


  22. Witze, A.: Software error doomed Japanese Hitomi spacecraft. Nature 533, 18–19 (2016)


  23. Wohlin, C., Runeson, P., Höst, M., Ohlsson, M.C., Regnell, B., Wesslén, A.: Experimentation in Software Engineering. Springer, Heidelberg (2012). doi:10.1007/978-3-642-29044-2



Acknowledgements

We thank our colleagues Karin Schmitz for transcribing the several hours of recorded interviews, and Björn Gütlich and Sabine Philipp-May for supporting our undertaking.

Author information

Correspondence to Christian R. Prause.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Prause, C.R., Werner, J., Hornig, K., Bosecker, S., Kuhrmann, M. (2017). Is 100% Test Coverage a Reasonable Requirement? Lessons Learned from a Space Software Project. In: Felderer, M., Méndez Fernández, D., Turhan, B., Kalinowski, M., Sarro, F., Winkler, D. (eds) Product-Focused Software Process Improvement. PROFES 2017. Lecture Notes in Computer Science(), vol 10611. Springer, Cham. https://doi.org/10.1007/978-3-319-69926-4_25


  • DOI: https://doi.org/10.1007/978-3-319-69926-4_25


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-69925-7

  • Online ISBN: 978-3-319-69926-4

