Transferring FAME, a Methodology for Assessing Open Source Solutions, from University to SMEs

  • Filippo E. Pani
  • Daniele Sanna
  • Michele Marchesi
  • Giulio Concas
Conference paper


We present FAME (Filter, Analyze, Measure and Evaluate), a simplified approach to Open Source software assessment. The approach was derived from heavier-weight, proven approaches developed in a university research environment, in order to match the needs of small organizations. It was developed by CC-ICT-SUD, a consortium that delivers technology transfer services and transfers advanced methodologies for software evaluation and assessment from academic to industrial contexts, in particular to Small and Medium Enterprises (SMEs). The FAME methodology is briefly described, and a case study involving the choice of a document management system for an SME is presented, showing how the approach can be used.
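The FAME acronym suggests a pipeline that first filters candidate projects against hard requirements and then ranks the survivors by weighted criteria. A minimal sketch of such a filter-then-score evaluation is shown below; all candidate names, criteria, and weights are hypothetical illustrations, not values from the FAME methodology itself.

```python
# Hypothetical sketch of a filter/score assessment pipeline, loosely
# inspired by FAME's Filter, Analyze, Measure and Evaluate phases.
# Candidate names, criteria, and weights are invented for illustration.

candidates = [
    {"name": "DMS-A", "license_ok": True,
     "scores": {"community": 4, "docs": 3, "maturity": 5}},
    {"name": "DMS-B", "license_ok": False,
     "scores": {"community": 5, "docs": 4, "maturity": 4}},
    {"name": "DMS-C", "license_ok": True,
     "scores": {"community": 2, "docs": 5, "maturity": 3}},
]

# Per-criterion weights, chosen by the evaluating organization; sum to 1.
weights = {"community": 0.5, "docs": 0.2, "maturity": 0.3}

def evaluate(cands, weights):
    # Filter phase: discard candidates that fail a hard requirement.
    shortlist = [c for c in cands if c["license_ok"]]
    # Measure/Evaluate phases: rank by weighted sum of 1-5 scores.
    return sorted(
        shortlist,
        key=lambda c: sum(weights[k] * v for k, v in c["scores"].items()),
        reverse=True,
    )

ranked = evaluate(candidates, weights)
best = ranked[0]["name"]  # highest weighted score among license-compatible tools
```

In a real assessment, the criteria and weights would come from the organization's requirements analysis, and the per-criterion scores from the measurement phase.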







CC-ICT-SUD funded this technology transfer experience.



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Filippo E. Pani
    • 1
  • Daniele Sanna
    • 2
  • Michele Marchesi
    • 1
  • Giulio Concas
    • 1
  1. Electronic and Electric Engineering Department, University of Cagliari, Cagliari, Italy
  2. FlossLab Srl, Cagliari, Italy
