Optimized Execution of Deterministic Blocks in Java PathFinder

  • Marcelo d’Amorim
  • Ahmed Sobeih
  • Darko Marinov
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4260)


Java PathFinder (JPF) is an explicit-state model checker for Java programs. It explores all executions that a given program can have due to different thread interleavings and nondeterministic choices. JPF implements a backtracking Java Virtual Machine (JVM) that executes Java bytecodes using a special representation of JVM states. This special representation enables JPF to quickly store, restore, and compare states; it is crucial for making the overall state exploration efficient. However, this special representation creates overhead for each execution, even execution of deterministic blocks that have no thread interleavings or nondeterministic choices.
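The store/restore/compare cycle described above can be sketched with a toy example (our own illustration, not JPF code, and all names are hypothetical): a backtracking VM snapshots its state at each branch point and pops a snapshot to backtrack, which is exactly the bookkeeping that JPF's special state representation makes efficient.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy illustration of a backtracking VM's state bookkeeping (not JPF code).
class ToyStateExplorer {
    private int[] heap = new int[4];                       // simplified "JVM state"
    private final Deque<int[]> trail = new ArrayDeque<>(); // saved snapshots

    void store()   { trail.push(heap.clone()); }  // snapshot state at a branch point
    void restore() { heap = trail.pop(); }        // backtrack to the last snapshot
    void write(int i, int v) { heap[i] = v; }
    int  read(int i) { return heap[i]; }

    public static void main(String[] args) {
        ToyStateExplorer vm = new ToyStateExplorer();
        vm.write(0, 7);
        vm.store();          // branch point: remember the state
        vm.write(0, 42);     // explore one choice
        vm.restore();        // backtrack, undoing the write
        System.out.println(vm.read(0)); // prints 7
    }
}
```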

We propose mixed execution, a technique that reduces the execution time of deterministic blocks in JPF. JPF is written in Java as a special JVM that runs on top of a regular, host JVM. Mixed execution works by translating the state between the special JPF representation and the host JVM representation. We also present lazy translation, an optimization that speeds up mixed execution by translating only the parts of the state that a specific execution dynamically depends on. We evaluate mixed execution on six programs that use JPF for generating tests for data structures and on one case study for verifying a network protocol. The results show that mixed execution can improve the overall time for state exploration by up to 36.98%, while improving the execution time of deterministic blocks by up to 69.15%. Although we present mixed execution in the context of JPF and Java, it generalizes to any model checker that uses a special state representation.
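The translation step at the heart of mixed execution can be illustrated with a minimal, hypothetical sketch (the representation and all names here are our own simplification, not JPF's actual data structures): the checker's "special" representation is converted to ordinary host-JVM objects, the deterministic block runs natively at full speed, and the result is translated back.

```java
// Hypothetical sketch of the mixed-execution idea (names are ours, not JPF's):
// translate a specially represented object to a host object, run a
// deterministic block natively, then translate the result back.
class MixedExecutionSketch {
    // "Special" model-checker representation: fields kept in an int slot array.
    static final class HostNode { int value; int weight; }

    static HostNode toHost(int[] slots) {   // special -> host translation
        HostNode n = new HostNode();
        n.value = slots[0];
        n.weight = slots[1];
        return n;
    }

    static int[] toSpecial(HostNode n) {    // host -> special translation
        return new int[] { n.value, n.weight };
    }

    // Deterministic block: no thread interleavings or nondeterministic
    // choices, so it can safely run on the host JVM at full speed.
    static void deterministicBlock(HostNode n) { n.value += n.weight; }

    public static void main(String[] args) {
        int[] special = {5, 9};             // slot 0 = value, slot 1 = weight
        HostNode n = toHost(special);       // translate in
        deterministicBlock(n);              // execute natively
        special = toSpecial(n);             // translate back
        System.out.println(special[0]);     // prints 14
    }
}
```

Lazy translation would defer each `toHost` conversion until the deterministic block first touches the corresponding slot, rather than translating the whole state up front.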







Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Marcelo d’Amorim (1)
  • Ahmed Sobeih (1)
  • Darko Marinov (1)

  1. Department of Computer Science, University of Illinois at Urbana-Champaign, Urbana, USA
