Maintaining and Evolving Measurement Programs

  • Miroslaw Staron
  • Wilhelm Meding


There is a widespread misconception about measurement programs: many believe that it is enough to set up measurement systems and a measurement program is then (more or less) in place, or that after the initial hard work the established program will live and thrive forever. This could not be further from the truth. The way in which measurement programs are introduced, maintained, and evolved in companies and organizations is of the utmost importance. Notions such as soft issues, involvement, respect, responsiveness, and evolution must be part of the everyday work with the measurement program. These are the topics we address in this chapter.





Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Miroslaw Staron, Department of Computer Science and Engineering, University of Gothenburg, Gothenburg, Sweden
  • Wilhelm Meding, Ericsson AB, Gothenburg, Sweden
