Symstra: A Framework for Generating Object-Oriented Unit Tests Using Symbolic Execution

  • Tao Xie
  • Darko Marinov
  • Wolfram Schulte
  • David Notkin
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3440)


Object-oriented unit tests consist of sequences of method invocations. The behavior of an invocation depends on the method’s arguments and on the state of the receiver at the beginning of the invocation. Correspondingly, generating unit tests involves two tasks: generating method sequences that build relevant receiver-object states and generating relevant method arguments. This paper proposes Symstra, a framework that achieves both test-generation tasks using symbolic execution of method sequences with symbolic arguments. The paper defines symbolic states of object-oriented programs and novel comparisons of states. Given a set of methods from the class under test and a bound on the length of sequences, Symstra systematically explores the object-state space of the class and prunes this exploration based on the state comparisons. Experimental results show that Symstra generates unit tests that achieve higher branch coverage faster than existing test-generation techniques based on concrete method arguments.
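The core loop the abstract describes — exhaustively extending method sequences up to a length bound while pruning sequences that revisit an already-explored receiver state — can be sketched as follows. This is an illustrative simplification, not the paper's implementation: it uses concrete states and concrete arguments, whereas Symstra compares symbolic states (so a single symbolic `push(x)` subsumes all the concrete pushes below); the `IntStack` class under test and the `explore` driver are hypothetical names introduced here.

```python
# Hypothetical class under test (not from the paper): a stack over an
# immutable tuple, so its state can be hashed for pruning.
class IntStack:
    def __init__(self):
        self.items = ()

    def push(self, x):
        self.items = self.items + (x,)

    def pop(self):
        if not self.items:
            raise IndexError("pop from empty stack")
        self.items = self.items[:-1]

    def state(self):
        # Abstraction of the receiver state used for state comparison.
        return self.items


def explore(methods, push_args, bound):
    """Breadth-first exploration of method sequences up to `bound`
    invocations, pruning any sequence that reaches a receiver state
    already seen (a concrete analogue of Symstra's symbolic-state
    subsumption pruning)."""
    seen = {()}              # receiver states already reached
    frontier = [((), ())]    # pairs of (method sequence, receiver state)
    tests = []               # generated sequences, one per new state
    for _ in range(bound):
        next_frontier = []
        for seq, state in frontier:
            for name in methods:
                # push takes one argument; pop takes none.
                for arg in (push_args if name == "push" else (None,)):
                    s = IntStack()
                    s.items = state  # restore receiver state
                    try:
                        getattr(s, name)(*(() if arg is None else (arg,)))
                    except IndexError:
                        continue     # infeasible invocation on this state
                    if s.state() in seen:
                        continue     # pruned: state already explored
                    seen.add(s.state())
                    new_seq = seq + ((name, arg),)
                    next_frontier.append((new_seq, s.state()))
                    tests.append(new_seq)
        frontier = next_frontier
    return tests


# With two concrete arguments and bound 2, pruning discards every
# sequence ending in pop (it only revisits shorter stacks).
tests = explore(["push", "pop"], [0, 1], 2)
```

Because the arguments here are concrete, `push(0)` and `push(1)` produce distinct states and both survive pruning; with Symstra's symbolic arguments they would collapse into one symbolic state, which is exactly why the abstract reports faster coverage than concrete-argument techniques.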


Keywords: Model Checking · Symbolic Execution · Symbolic Expression · Symbolic State · Java Modeling Language



Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Tao Xie (1)
  • Darko Marinov (2)
  • Wolfram Schulte (3)
  • David Notkin (1)
  1. Dept. of Computer Science & Engineering, Univ. of Washington, Seattle, USA
  2. Department of Computer Science, University of Illinois, Urbana-Champaign, USA
  3. Microsoft Research, Redmond, USA
