Incremental Test Case Generation for Distributed Object-Oriented Systems

  • Holger Fuchs
Part of the Informatik Aktuell book series (INFORMAT)

Abstract

Distributed object-oriented software systems (DOOS) are becoming increasingly common, yet little work exists on testing them in an integrated manner; instead, their distribution and object features have typically been tested separately. This paper addresses the test case derivation phase in the development process of a DOOS. It describes a systematic approach and reports on TeDOOS, a framework for testing distributed object-oriented systems. TeDOOS uses a hierarchical decomposition technique to reduce complexity. Each level has its own fault model, test strategy, and test case derivation scheme that addresses the specific requirements of the distributed and object paradigms at that level of abstraction. Where possible, well-known test models are adapted and reused across levels to aid understandability. Moreover, each test level uses the results of the previous level to confine the global test space to a manageable size, so that the overall test effort is minimized. Incremental test case generation for a DOOS is illustrated with the example of a banking system.
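To make the incremental idea concrete, the minimal sketch below shows how level-by-level derivation might look in code: each level targets its own fault model and only contributes test cases that are not already covered by the level beneath it, which is how the global test space is kept manageable. The level names, fault descriptions, and banking operations used here are illustrative assumptions, not the fault models or derivation schemes defined in the paper.

```python
# Hypothetical sketch of TeDOOS-style incremental test case generation.
# Level names, fault models, and banking operations are assumptions for
# illustration only; the paper defines its own levels and models.

from dataclasses import dataclass


@dataclass
class TestLevel:
    name: str           # e.g. "method", "class", "cluster", "system"
    fault_model: str    # fault types targeted at this level
    candidates: list    # test cases derivable at this level


def derive_incrementally(levels):
    """Generate test cases level by level; each level reuses the results
    of the previous level and only adds cases not yet covered below."""
    covered = set()
    suite = {}
    for level in levels:
        new_cases = [c for c in level.candidates if c not in covered]
        suite[level.name] = new_cases
        covered.update(new_cases)  # assume these cases pass, for the sketch
    return suite


# Toy banking-system example (assumed operations).
levels = [
    TestLevel("method", "wrong computation in a single operation",
              ["deposit adds amount", "withdraw rejects overdraft"]),
    TestLevel("class", "invalid intra-object state sequences",
              ["deposit adds amount", "open -> deposit -> withdraw -> close"]),
    TestLevel("cluster", "faulty interactions between objects",
              ["transfer debits source and credits target"]),
    TestLevel("system", "distribution faults (ordering, concurrency)",
              ["concurrent transfers preserve the total balance"]),
]

if __name__ == "__main__":
    for name, cases in derive_incrementally(levels).items():
        print(name, "->", cases)
```

Note how the class level drops the "deposit adds amount" case already exercised at the method level; only the new state-sequence behaviour is added, mirroring the containment of the test space described above.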



Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Holger Fuchs, Institute for Technical Information Systems, University of Magdeburg, Germany
