Enhancing the Conceptual Framework Capability for a Measurement and Evaluation Strategy

  • Pablo Becker
  • Fernanda Papa
  • Luis Olsina
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8295)


Abstract

To provide consistency and repeatability for measurement and evaluation (M&E) projects and programs, a well-established M&E strategy is needed. In previous work, we discussed the benefits of an integrated M&E strategy that relies on three capabilities: an M&E conceptual framework, process specifications, and method specifications. We also developed GOCAME (Goal-Oriented Context-Aware Measurement and Evaluation), an integrated M&E strategy that supports these capabilities. In the present work, we enhance its former conceptual framework with the recently built process ontology, also enriching the M&E terms with stereotypes stemming from the process conceptual base. The augmented conceptual framework has a positive impact on the other strategy capabilities as well, since it ensures terminological uniformity and testability for the process and method specifications. For illustration purposes, excerpts of process specifications reflecting the new situation are highlighted.
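The enrichment the abstract describes can be pictured as attaching process-ontology stereotypes (in UML-style «guillemet» notation) to terms of the M&E conceptual framework. The following is a minimal, hypothetical Python sketch of that idea; the specific term-to-stereotype pairings below are illustrative assumptions, not the paper's actual mapping.

```python
# Hypothetical sketch: tagging M&E conceptual-framework terms with
# stereotypes drawn from a process ontology. The pairings used here
# (e.g. Measurement as «Activity») are illustrative assumptions only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Term:
    name: str                  # M&E conceptual framework term
    stereotypes: tuple = ()    # stereotypes from the process conceptual base

    def __str__(self) -> str:
        # Render each stereotype in UML guillemet notation before the term
        tags = "".join(f"\u00ab{s}\u00bb " for s in self.stereotypes)
        return f"{tags}{self.name}"


# Illustrative mapping of M&E terms to process stereotypes
measurement = Term("Measurement", ("Activity",))
metric = Term("Metric", ("Method",))

print(measurement)   # «Activity» Measurement
print(metric)        # «Method» Metric
```

A stereotyped rendering like this makes the claimed benefit concrete: a reader of a process specification can see at a glance which ontological category each M&E term plays, which is what gives the specifications their terminological uniformity.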


Keywords: Process Ontology · Quality · Measurement · Evaluation · GOCAME · C-INCAMI



Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  • Pablo Becker (1)
  • Fernanda Papa (1)
  • Luis Olsina (1)

  1. GIDIS_Web, Engineering School, UNLPam, La Pampa, Argentina
