
ISERN: A Distributed Experiment - Ein verteiltes Inspektionsexperiment (A Distributed Inspection Experiment)

Chapter in: Software-Messung und -Bewertung

Abstract

In the long run, high software quality is a prerequisite for software companies to survive. Inspections of software products help to detect and remove errors early and cost-effectively in software development, and thus help to enhance the quality of software products. A main goal in practice is to select a suitable inspection technique for a given situation. Empirical studies are necessary for making such a selection, as they provide real-world data from a documented context.

The results of individual empirical studies, however, are not easily transferable to other situations without knowledge of their context. In particular, it is unclear to what degree experimental results depend on their context, that is, which context factors influence the results. Insight into how experimental results vary across contexts can be gained, for example, by repeating, or replicating, experiments in those contexts. Distributed experiments that are conducted in many different contexts and that systematically measure context factors are therefore especially desirable, as they can improve the generalizability of experimental results.

This paper describes a concept for a distributed inspection experiment being planned and conducted by the International Software Engineering Research Network (ISERN), in which about 20 organizations from industry and research collaborate. The first step of the empirical study is a broad survey of software companies to reveal the state of the practice regarding inspection processes and inspection techniques. The second step is to conduct experiments with documents from within ISERN and state-of-the-art inspection techniques, as well as with the participating organizations' own documents and inspection techniques. The goal of the distributed experiment is to describe where the variations in the practice of inspection techniques lie and to observe the effectiveness of these techniques with respect to relevant context factors. From a research perspective, this experiment can help determine how experimental results can be generalized more effectively.
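The study design above compares the effectiveness of inspection techniques across contexts. A minimal sketch of the underlying measurement, assuming the common defect-detection ratio (defects found divided by total known defects in a seeded document) as the effectiveness metric; the context labels and numbers are purely illustrative, not results from the experiment:

```python
def effectiveness(found: int, total: int) -> float:
    """Fraction of known defects detected by one inspection run."""
    if total <= 0:
        raise ValueError("total known defects must be positive")
    return found / total

# Hypothetical replication results in two different contexts.
runs = [
    {"context": "industry team, checklist-based reading", "found": 18, "total": 30},
    {"context": "student team, perspective-based reading", "found": 24, "total": 30},
]

for run in runs:
    run["effectiveness"] = effectiveness(run["found"], run["total"])
    print(f'{run["context"]}: {run["effectiveness"]:.2f}')
```

Collecting such per-run ratios together with systematically recorded context factors is what allows a distributed experiment to relate effectiveness to context rather than report a single aggregate number.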





Editors: Reiner Dumke, Dieter Rombach


Copyright information

© 2002 Springer Fachmedien Wiesbaden

Cite this chapter

Ciolkowski, M., Biffl, S., Rombach, D. (2002). ISERN: A Distributed Experiment - Ein verteiltes Inspektionsexperiment. In: Dumke, R., Rombach, D. (eds) Software-Messung und -Bewertung. Information Engineering und IV-Controlling. Deutscher Universitätsverlag, Wiesbaden. https://doi.org/10.1007/978-3-663-11381-2_8


  • DOI: https://doi.org/10.1007/978-3-663-11381-2_8

  • Publisher Name: Deutscher Universitätsverlag, Wiesbaden

  • Print ISBN: 978-3-8244-7592-6

  • Online ISBN: 978-3-663-11381-2

  • eBook Packages: Springer Book Archive
