An Empirical Investigation on the Impact of Training-by-Examples on Inspection Performance

  • Atiq Chowdhury
  • Lesley Pek Wee Land
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3009)

Abstract

Software inspection is often seen as a technique for producing quality software. It has been claimed that expertise is a key determinant of inspection performance, particularly in individual defect detection and group meetings [33]. Uncertainty among reviewers during group meetings, arising from a lack of expertise, is seen as a weakness in inspection performance. One way of acquiring expertise is through education or formal training, and recent theoretical frameworks in software inspection also support the idea that training may affect inspection performance [33]. A laboratory experiment was conducted to test the effects of training by examples on requirements inspection. Our findings show that the trained group performed significantly better than the group which received no training. However, the ‘experienced’ reviewers did not outperform those with no experience. The results have implications for the use of a repository of defect examples for training reviewers.

Keywords

Defect Type, Requirement Document, Inspection Process, Automatic Teller Machine, Defect Class

References

  1. Basili, V.R., Green, S., Laitenberger, O., Lanubile, F., Shull, F., Sorumgard, S., Zelkowitz, M.: The Empirical Investigation of Perspective-Based Reading. Empirical Software Engineering: An International Journal 2(1), 133–164 (1996)
  2. Biffl, S., Halling, M.: Investigating the Influence of Inspectors’ Capability Factors with Four Inspection Techniques on Inspection Performance. In: Proc. IEEE Int’l Software Metrics Symp. (2002)
  3. Biffl, S., Halling, M.: Investigating the Defect Detection Effectiveness and Cost Benefit of Nominal Inspection Teams. IEEE Transactions on Software Engineering 29(5), 385–387 (2003)
  4. Boehm, B.W.: Software Engineering Economics. In: Advances in Computing Science and Technology, Prentice Hall, Englewood Cliffs (1981)
  5. Carver, J., Shull, F., Basili, V.: Investigating the Effects of Process Experience on Inspection Effectiveness. University of Maryland Technical Report CS-TR-4442 (March 2003)
  6. Carver, J.: The Impact of Background and Experience on Software Inspections. PhD Thesis, University of Maryland (April 2003)
  7. Carver, J., Basili, V.: Identifying Implicit Process Variables to Support Future Empirical Work. In: Proceedings of the 17th Brazilian Symposium on Software Engineering (SBES 2003), October 2003, pp. 5–18 (2003)
  8. Charney, D.H., Reder, L.M.: Designing interactive tutorials for computer users: Effects of the form and spacing of practice on skill learning. Human-Computer Interaction 2, 297–317 (1986)
  9. Chase, W.G., Simon, H.A.: The Mind’s Eye in Chess. In: Chase, W.G. (ed.) Visual Information Processing, pp. 215–281. Academic Press, New York (1973)
  10. Cheng, B., Jeffery, R.: Comparing Inspection Strategies for Software Requirements Specifications. In: Proceedings of the 1996 Australian Software Engineering Conference, pp. 203–211 (1996)
  11. Cooper, G., Sweller, J.: The effects of schema acquisition and rule automation on mathematical problem-solving transfer. Journal of Educational Psychology 79, 347–362 (1987)
  12. Dansereau, D.F.: The development of a learning strategies curriculum. In: O’Neil Jr., H.F. (ed.) Learning Strategies, Academic Press, New York (1978)
  13. Fagan, M.E.: Design and code inspections to reduce errors in program development. IBM Systems Journal 15(3), 182–211 (1976)
  14. Fowler, P.J.: In-process inspections of work-products at AT&T. AT&T Journal, 102–112 (March/April 1986)
  15. Freedman, D.P., Weinberg, G.M.: Handbook of Walkthroughs, Inspections and Technical Reviews: Evaluating Programs, Projects, and Products, 3rd edn. Dorset House Publishing, New York (1990)
  16. Gilb, T., Graham, D.: Software Inspection. Addison-Wesley, Reading (1993)
  17. Goska, R., Ackerman, P.: An Aptitude-Treatment Interaction Approach to Transfer Within Training. Journal of Educational Psychology 88(2), 249–259 (1996)
  18. Host, M., Regnell, B., Wohlin, C.: Using Students as Subjects – A Comparative Study of Students and Professionals in Lead-Time Impact Assessment. Empirical Software Engineering 5, 201–214 (2000)
  19. Humphrey, W.S.: A Discipline for Software Engineering. Addison-Wesley Publishing Company, Reading (1995)
  20. Kim, L.P.W., Sauer, C., Jeffery, R.: A Framework for Software Development Technical Reviews. In: Lee, M., Barta, B.-Z., Juliff, P. (eds.) Software Quality and Productivity: Theory, Practice, Education and Training, pp. 294–299. IFIP/Chapman and Hall (1995)
  21. Laitenberger, O., DeBaud, J.M.: An Encompassing Life Cycle Centric Survey of Software Inspection. Journal of Systems and Software 50(1), 5–31 (2000)
  22. Land, L.: Software Group Reviews and the Impact of Procedural Roles on Defect Detection Performance. Thesis, University of New South Wales (2000)
  23. Land, L., Wong, B., Jeffery, R.: An Extension of the Behavioural Theory of Group Performance in Software Development Technical Reviews. In: Proc. Asia-Pacific Software Engineering Conference (APSEC 2003) (2003)
  24. Lieberman, H.: An Example Based Environment for Beginning Programmers. Instructional Science 14(3), 277–292 (1986)
  25. Melo, W., Shull, F., Travassos, G.H.: Software Review Guidelines. Technical Report ES-556/01, Systems Engineering and Computer Science Program, COPPE, Federal University of Rio de Janeiro (September 2001)
  26. Miller, J., Daly, J., Wood, M., Roper, M., Brooks, A.: Statistical power and its subcomponents – missing and misunderstood concepts in empirical software engineering research. Information and Software Technology 39, 285–295 (1997)
  27. O’Neil, H.F., Spielberger, C.: Cognitive and Affective Learning Strategies. Academic Press, London (1979)
  28. Parnas, D.L., Weiss, D.M.: Active Design Reviews: Principles and Practices. In: Proceedings of the 8th International Conference on Software Engineering, pp. 418–426 (1985)
  29. Porter, A.A., Votta, L.G., Basili, V.R.: Comparing detection methods for software requirements inspections: A replicated experiment. IEEE Transactions on Software Engineering 21(6), 563–575 (1995)
  30. Porter, A.A., Johnson, P.M.: Assessing Software Review Meetings: Results of a Comparative Analysis of Two Experimental Studies. IEEE Transactions on Software Engineering 23(3), 129–145 (1997)
  31. Porter, A.A., Votta, L.G.: What Makes Inspections Work? IEEE Software, 99–102 (1997)
  32. Proctor, R.W., Dutta, A.: Skill Acquisition and Human Performance. Sage, Thousand Oaks (1995)
  33. Sauer, C., Jeffery, R., Land, L., Yetton, P.: The Effectiveness of Software Development Technical Reviews: A Behaviourally Motivated Program of Research. IEEE Transactions on Software Engineering 26(1) (January 2000)
  34. Schneider, G.M., Martin, J., Tsai, W.T.: An experimental study of fault detection in user requirements documents. ACM Transactions on Software Engineering and Methodology 1(2), 188–204 (1992)
  35. Shull, F., Carver, J., Travassos, G.: An Empirical Methodology for Introducing Software Processes. In: Proceedings of the Joint 8th European Software Engineering Conference (ESEC) and 9th ACM SIGSOFT Symposium on the Foundations of Software Engineering (FSE-9), Vienna, Austria, September 10–14, pp. 288–296 (2001)
  36. Sweller, J., Cooper, G.A.: The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction 2, 59–89 (1985)
  37. Sweller, J.: Instructional Design in Technical Areas. ACER Press (1999)
  38. Mwangi, W., Sweller, J.: Learning to Solve Compare Word Problems: The Effect of Example Format and Generating Self-Explanations. Cognition and Instruction 16(2), 173–199 (1998)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Atiq Chowdhury (1)
  • Lesley Pek Wee Land (1)

  1. School of Information Systems, Technology and Management, The University of New South Wales, Sydney, Australia