
Reinforced Condition/Decision Coverage (RC/DC): A New Criterion for Software Testing

  • Sergiy A. Vilkomir
  • Jonathan P. Bowen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2272)

Abstract

A new Reinforced Condition/Decision Coverage (RC/DC) criterion for software testing is proposed. This criterion is a further development of the well-known Modified Condition/Decision Coverage (MC/DC) criterion and is better suited to the testing of safety-critical software. Formal definitions in the Z notation are presented for RC/DC as well as for MC/DC. Specific examples of the use of these criteria are considered and some of their features are formally proved.
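The paper's definitions of MC/DC and RC/DC are given formally in Z. As an informal illustration only, the sketch below (Python) enumerates, for a hypothetical three-condition decision (A and B) or C, the kinds of test-case pairs involved: MC/DC-style pairs in which toggling a single condition changes the decision's outcome, and additional RC/DC-style pairs in which toggling a condition leaves the decision unchanged (where such vectors exist). The decision expression and the helper toggling_pairs are assumptions introduced for illustration, not taken from the paper.

```python
from itertools import product

def decision(a, b, c):
    # Hypothetical decision with three conditions; illustrative only,
    # not an example from the paper.
    return (a and b) or c

def toggling_pairs(cond_index, keep=None):
    """Pairs of test vectors that differ only in condition `cond_index`.

    keep=None  : pairs where the decision outcome changes (MC/DC-style)
    keep=True  : pairs where the decision stays True  (RC/DC-style "keep true")
    keep=False : pairs where the decision stays False (RC/DC-style "keep false")
    """
    pairs = []
    for vec in product([False, True], repeat=3):
        if vec[cond_index]:
            continue  # consider each unordered pair once (condition False -> True)
        flipped = list(vec)
        flipped[cond_index] = True
        d1, d2 = decision(*vec), decision(*flipped)
        if keep is None and d1 != d2:
            pairs.append((vec, tuple(flipped)))
        elif keep is not None and d1 == d2 == keep:
            pairs.append((vec, tuple(flipped)))
    return pairs

if __name__ == "__main__":
    for i, name in enumerate("ABC"):
        print(f"condition {name}:")
        print("  MC/DC (decision changes):      ", toggling_pairs(i))
        print("  RC/DC extra (decision stays T):", toggling_pairs(i, keep=True))
        print("  RC/DC extra (decision stays F):", toggling_pairs(i, keep=False))
```

Running this shows, for example, that condition A has an MC/DC pair only when B is true and C is false, while the RC/DC-style pairs additionally cover the situations where C is true (decision kept true) or B and C are both false (decision kept false).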

Keywords

Software Testing · Coverage Criterion · Decision Coverage · Computer Control System · Main Track



Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Sergiy A. Vilkomir¹
  • Jonathan P. Bowen¹
  1. Centre for Applied Formal Methods, School of Computing, Information Systems and Mathematics, South Bank University, London, UK
