Localizing Program Errors for Cimple Debugging

  • Samik Basu
  • Diptikalyan Saha
  • Scott A. Smolka
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3235)


We present automated techniques for the explanation of counter-examples, where a counter-example should be understood as a sequence of program statements. Our approach is based on variable dependency analysis and is applicable to programs written in Cimple, an expressive subset of the C programming language. Central to our approach is the derivation of a focus-statement sequence (FSS) from a given counter-example: a subsequence of the counter-example containing only those program statements that directly or indirectly affect the variable valuation leading to the program error behind the counter-example. We develop a ranking procedure for FSSs in which FSSs of higher rank are conceptually easier to understand and correct than those of lower rank. We also analyze constraints over uninitialized variables in order to localize program errors to specific program segments; this often allows the user to subsequently take appropriate debugging measures. We have implemented our techniques in the FocusCheck model checker, which efficiently checks for assertion violations in Cimple programs on a per-procedure basis. The practical utility of our approach is illustrated by its successful application to a fast, linear-time median identification algorithm commonly used in statistical analysis, and to the Resolution Advisory module of the Traffic Collision Avoidance System (TCAS).
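The core idea of a focus-statement sequence can be illustrated with a minimal dynamic-slicing sketch: walk the counter-example trace backward and keep only those statements whose definitions feed, directly or indirectly, the variables of the failing assertion. This is an illustrative simplification, not the paper's algorithm; all statement names and the trace below are invented, and the sketch tracks only data dependencies.

```python
# Hypothetical sketch of focus-statement-sequence (FSS) extraction by
# backward data-dependency analysis over a counter-example trace.
# The Stmt fields and the example trace are illustrative, not from the paper.
from dataclasses import dataclass, field

@dataclass
class Stmt:
    text: str
    defs: set = field(default_factory=set)  # variables written by this statement
    uses: set = field(default_factory=set)  # variables read by this statement

def focus_sequence(trace, error_vars):
    """Return the subsequence of `trace` that directly or indirectly
    affects the valuation of `error_vars` at the failing assertion."""
    relevant = set(error_vars)
    focus = []
    for stmt in reversed(trace):
        if stmt.defs & relevant:       # statement defines a currently relevant variable
            relevant -= stmt.defs      # that definition is now accounted for...
            relevant |= stmt.uses      # ...in terms of the variables it read
            focus.append(stmt)
    focus.reverse()                    # restore original program order
    return focus

# Counter-example trace for a failing assertion over `m` (illustrative):
trace = [
    Stmt("x = input()", defs={"x"}),
    Stmt("y = input()", defs={"y"}),
    Stmt("z = 0",       defs={"z"}),          # does not affect `m`
    Stmt("m = x",       defs={"m"}, uses={"x"}),
    Stmt("m = m + y",   defs={"m"}, uses={"m", "y"}),
]
fss = focus_sequence(trace, {"m"})
print([s.text for s in fss])  # the z-assignment is filtered out
```

Only the statements feeding `m` survive the backward pass, which is the filtering effect the abstract attributes to FSS derivation; the paper's variable dependency analysis operates on Cimple programs rather than on this toy trace representation.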


Keywords: Model Checker, Conditional Statement, Reachability Analysis, Conditional Expression, Focus Statement
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© IFIP International Federation for Information Processing 2004

Authors and Affiliations

  • Samik Basu (1)
  • Diptikalyan Saha (2)
  • Scott A. Smolka (2)
  1. Department of Computer Science, Iowa State University, Ames
  2. Department of Computer Science, State University of New York at Stony Brook, Stony Brook
