
Automated Software Engineering, Volume 13, Issue 3, pp 373–394

A family of experiments to investigate the effects of groupware for software inspection

  • Stefan Biffl
  • Paul Grünbacher
  • Michael Halling

Abstract

It is widely accepted that the inspection of software artifacts can find defects early in the development process and gather information on the quality of the evolving product. However, the inspection process is resource-intensive and involves tedious tasks such as searching, sorting, and checking. Tool support can accelerate these tasks and let inspectors concentrate on the tasks that particularly need human attention, yet only a few inspection tools are available. We have therefore developed a set of groupware tools for both individual defect detection and inspection meetings, with the goal of lowering inspection effort and increasing inspection efficiency. This paper presents the Groupware-supported Inspection Process (GrIP) and describes tools for inspecting software requirements. As little empirical work directly compares paper-based and tool-based software inspection, we conducted a family of experiments in an academic environment to investigate the effect of tool support on defect detection and inspection meetings. The main results for individual defect detection are promising: the effectiveness of inspectors and teams is comparable to paper-based inspection without tool support, the inspection effort and defect overlap decrease significantly with tool support, and the efficiency of inspection teams increases considerably. Regarding tool support for inspection meetings, the main findings are that tool support considerably lowers meeting effort, helps inspectors identify false positives, and reduces the number of true defects lost during a meeting. However, the number of unidentified false positives remains quite high.
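The abstract refers to effectiveness, efficiency, and defect overlap as team-level measures. As a rough illustration (not the authors' actual analysis code), the following Python sketch computes these metrics under common definitions from the inspection literature: effectiveness as the fraction of known true defects found, efficiency as true defects found per person-hour, and overlap as the share of reported defects found by more than one inspector. All names and the example data are hypothetical.

```python
# Minimal sketch of the team-level inspection metrics discussed above.
# Formulas follow common definitions from the inspection literature;
# the function and variable names are illustrative assumptions.

from typing import Dict, Set


def team_metrics(reports: Dict[str, Set[str]],
                 true_defects: Set[str],
                 effort_hours: float) -> Dict[str, float]:
    """Compute team-level metrics from per-inspector defect reports.

    reports      -- maps inspector name to the set of defect IDs reported
    true_defects -- reference set of seeded/known true defects
    effort_hours -- total person-hours spent by the team
    """
    union = set().union(*reports.values())       # all defects reported by anyone
    found = union & true_defects                 # true defects the team found
    false_positives = union - true_defects       # reports matching no real defect
    # A defect "overlaps" if more than one inspector reported it.
    overlap = {d for d in union
               if sum(d in r for r in reports.values()) > 1}
    return {
        "effectiveness": len(found) / len(true_defects),
        "efficiency": len(found) / effort_hours,  # true defects per person-hour
        "overlap_ratio": len(overlap) / len(union) if union else 0.0,
        "false_positive_ratio": len(false_positives) / len(union) if union else 0.0,
    }


# Example: three inspectors, six seeded defects, 9 person-hours of effort.
reports = {"A": {"d1", "d2", "f1"}, "B": {"d2", "d3"}, "C": {"d1", "d4"}}
print(team_metrics(reports, {"d1", "d2", "d3", "d4", "d5", "d6"}, 9.0))
# -> effectiveness 0.67, efficiency 0.44, overlap_ratio 0.40, false_positive_ratio 0.20
```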

Keywords

Software inspection · Defect detection · Inspection meeting · Tool support · Software quality measurement · Controlled experiment · Empirical software engineering



Copyright information

© Springer Science + Business Media, LLC 2006

Authors and Affiliations

  • Stefan Biffl (1)
  • Paul Grünbacher (2)
  • Michael Halling (3)

  1. Institute of Software Technology and Interactive Systems, Vienna University of Technology, Vienna, Austria
  2. Christian Doppler Laboratory for Automated Software Engineering, Johannes Kepler University Linz, Linz, Austria
  3. Department of Finance, University of Vienna, Vienna, Austria
