Abstract

This paper describes a project to measure and evaluate the software application systems of a financial services provider. As a result of several mergers, the corporation had accumulated over the years more than 75 million lines of code in several different programming languages. The goal of the project was to determine the size, complexity and quality of the individual systems and to assess their potential for reuse. Not only the program sources, but also the database schemas, the JCL procedures and the user interface maps had to be analyzed. For this purpose a metric database was established. Three related tools were used in the measurement project: SoftAudit to measure the code, SoftEval to aggregate the measurement data in the metric database and to evaluate it, and SoftCalc to calculate the costs of the various strategic alternatives. The paper focuses on the problems and solutions associated with such a massive measurement effort across large code bases.
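To make the measure-and-aggregate workflow concrete, the following is a minimal illustrative sketch of the general idea: count size and a crude decision-based complexity proxy per source member and store the results in a metric database table. It is purely an assumption-laden example, not the SoftAudit/SoftEval tooling used in the project; the file pattern, keyword list, comment heuristic and table layout are all hypothetical.

```python
# Hypothetical sketch of measuring source members and aggregating the results
# in a small metric database. Not the project's actual tooling.
import sqlite3
from pathlib import Path

# Crude decision-keyword proxies for control-flow complexity (COBOL-flavoured; assumed)
DECISION_KEYWORDS = ("IF", "EVALUATE", "WHEN", "PERFORM UNTIL")

def measure_file(path: Path) -> dict:
    """Return simple size and complexity counts for one source member."""
    text = path.read_text(errors="ignore")
    lines = text.splitlines()
    # Treat blank lines and '*'-prefixed lines as non-code (a rough comment heuristic)
    code_lines = [l for l in lines if l.strip() and not l.lstrip().startswith("*")]
    upper = text.upper()
    decisions = sum(upper.count(kw) for kw in DECISION_KEYWORDS)
    return {
        "name": path.name,
        "total_lines": len(lines),
        "code_lines": len(code_lines),
        "decisions": decisions,
    }

def store_measurements(db_path: str, records: list[dict]) -> None:
    """Aggregate per-member measurements into one metric database table."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS module_metrics ("
        "name TEXT, total_lines INTEGER, code_lines INTEGER, decisions INTEGER)"
    )
    con.executemany(
        "INSERT INTO module_metrics VALUES (:name, :total_lines, :code_lines, :decisions)",
        records,
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    sources = sorted(Path("sources").glob("*.cbl"))  # assumed source directory
    store_measurements("metrics.db", [measure_file(p) for p in sources])
```

Evaluation against quality targets, for example the ISO-9126 characteristics named in the keywords, would then run as queries over such a database, which is the role the abstract assigns to SoftEval.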

Keywords

Code measurement, size, complexity and quality metrics, metric database, metric evaluation, ISO-9126

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Harry M. Sneed
  1. ANECON GmbH, Vienna, Austria; Institut für Wirtschaftsinformatik, Universität Regensburg
