• Miroslaw Staron
  • Wilhelm Meding


In this chapter, we introduce the problems that software measurement addresses, such as providing quantitative insights, and we describe the possibilities that open up once measurement of software products, processes, and the enterprise is in place. We discuss quantitative, fact-based management, customer data-driven development, and the use of artificial intelligence (or machine learning) once a solid measurement program is in place. Towards the end of the chapter, we outline the concept of a company-wide measurement program and introduce the contents of the book.





Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Miroslaw Staron (1)
  • Wilhelm Meding (2)

  1. Department of Computer Science and Engineering, University of Gothenburg, Gothenburg, Sweden
  2. Ericsson AB, Gothenburg, Sweden
