Are We There Yet? Determining the Adequacy of Formalized Requirements and Test Suites

  • Anitha Murugesan
  • Michael W. Whalen
  • Neha Rungta
  • Oksana Tkachuk
  • Suzette Person
  • Mats P. E. Heimdahl
  • Dongjiang You
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9058)

Abstract

Structural coverage metrics have traditionally categorized code as either covered or uncovered. Recent work presents a stronger notion of coverage, checked coverage, which counts only statements whose execution contributes to an outcome checked by an oracle. While this notion of coverage addresses the adequacy of the oracle, for Model-Based Development of safety-critical systems, it is still not enough; we are also interested in how much of the oracle is covered, and whether the values of program variables are masked when the oracle is evaluated. Such information can help system engineers identify missing requirements as well as missing test cases. In this work, we combine results from checked coverage with results from requirements coverage to help provide insight to engineers as to whether the requirements or the test suite needs to be improved. We implement a dynamic backward slicing technique and evaluate it on several systems developed in Simulink. The results of our preliminary study show that even for systems with comprehensive test suites and good sets of requirements, our approach can identify cases where more tests or more requirements are needed to improve coverage numbers.
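
As a companion to the abstract, the sketch below illustrates the core idea of checked coverage computed by dynamic backward slicing over a recorded execution trace. It is a minimal, hypothetical Python rendering, not the authors' Simulink/Lustre tooling: the Step trace format, the variable names, and the single-oracle-variable setup are assumptions made for the example, and a real dynamic slicer would also track control dependencies, which are omitted here.

    # Illustrative sketch (an assumption, not the authors' implementation):
    # compute a checked-coverage figure for one test by slicing the execution
    # trace backwards from the variables the test oracle actually checks.

    from dataclasses import dataclass
    from typing import List, Optional, Set, Tuple

    @dataclass(frozen=True)
    class Step:
        stmt: str               # identifier of the executed statement/block
        defines: Optional[str]  # variable written by this step, if any
        uses: Tuple[str, ...]   # variables read by this step

    def dynamic_backward_slice(trace: List[Step], oracle_vars: Set[str]) -> Set[str]:
        """Return the statements whose execution influences a value checked by the oracle."""
        relevant = set(oracle_vars)      # variables whose definitions we still need
        in_slice: Set[str] = set()
        for step in reversed(trace):     # walk the recorded trace backwards
            if step.defines is not None and step.defines in relevant:
                in_slice.add(step.stmt)
                relevant.discard(step.defines)
                relevant.update(step.uses)   # follow data dependencies
        return in_slice

    # Hypothetical trace for one test case; the oracle only checks "out".
    trace = [
        Step("s1", "a",   ("in1",)),
        Step("s2", "b",   ("in2",)),     # executed, but its result is never checked
        Step("s3", "out", ("a",)),
    ]
    executed = {s.stmt for s in trace}
    checked = dynamic_backward_slice(trace, oracle_vars={"out"})
    print("checked coverage: %d of %d executed statements" % (len(checked), len(executed)))

In this example, s2 is covered in the traditional sense but contributes to nothing the oracle checks, so checked coverage reports 2 of 3 statements; that gap is the kind of signal the paper combines with requirements coverage to decide whether tests or requirements are missing.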

Keywords

Test Suite, Execution Trace, Test Case Generation, Coverage Metrics, Code Coverage

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Anitha Murugesan (1) (corresponding author)
  • Michael W. Whalen (1)
  • Neha Rungta (2)
  • Oksana Tkachuk (2)
  • Suzette Person (3)
  • Mats P. E. Heimdahl (1)
  • Dongjiang You (1)

  1. Department of Computer Science and Engineering, University of Minnesota, Minneapolis, USA
  2. NASA Ames Research Center, Mountain View, USA
  3. NASA Langley Research Center, Hampton, USA
