A Study of Tool Support for the Evaluation of Programming Exercises

  • Heinz Dobler
  • Rudolf Ramler
  • Klaus Wolfmaier
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4739)


Learning to program requires both theory and exercises. Traditionally, feedback on exercise solutions is provided through manual review of the source code. This review effort is substantial, and the identified problems are prone to subjective interpretation. We study static analysis and testing tools as an enhancement to manual reviews and discuss their benefits. We analyze our findings by cross-checking the outcomes of the different tools with each other, with the tutors’ remarks, and with the course outcome, additionally taking the final examination results into account. The effort of applying the tools was also substantial, and it turned out that they are no replacement for manual review. Nevertheless, tool support is a valuable enhancement: it localizes problem areas, checks programming rules accurately, and detects plagiarism efficiently.
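As a rough illustration of the kind of tool support the abstract mentions for plagiarism detection (this is a hypothetical sketch, not the authors' actual tooling), student submissions can be compared by the Jaccard similarity of their token n-grams, which is robust against renamed identifiers being the only difference when tokens are normalized further:

```python
# Hypothetical sketch: token-3-gram Jaccard similarity between two
# source files, a common basis for simple plagiarism screening.
# Not the method used in the paper; illustrative only.
import re

def tokens(source: str) -> list[str]:
    # Crude lexer: identifiers, numbers, and single punctuation characters.
    return re.findall(r"[A-Za-z_]\w*|\d+|\S", source)

def ngrams(toks: list[str], n: int = 3) -> set[tuple[str, ...]]:
    # All contiguous runs of n tokens.
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def similarity(a: str, b: str, n: int = 3) -> float:
    # Jaccard index of the two n-gram sets: |A ∩ B| / |A ∪ B|.
    ga, gb = ngrams(tokens(a), n), ngrams(tokens(b), n)
    if not ga and not gb:
        return 1.0
    return len(ga & gb) / len(ga | gb)
```

A score near 1.0 flags a pair of submissions for a closer manual look; as the paper argues for reviews in general, the tool only localizes suspects and does not replace human judgment.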


Keywords: Manual Review · Symbolic Execution · Static Program Analysis · Dead Code · Memory Leak





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Heinz Dobler (1)
  • Rudolf Ramler (2)
  • Klaus Wolfmaier (2)

  1. University of Applied Sciences, Softwarepark 11, 4232 Hagenberg, Austria
  2. Software Competence Center Hagenberg, Softwarepark 21, 4232 Hagenberg, Austria