Lessons from the Study

  • Poul Erik Mouritzen
  • Niels Opstrup
Part of the Public Sector Organizations book series (PSO)


This chapter summarizes the practical and theoretical lessons from the study. What can policymakers learn from the Danish case? The chapter discusses the paradoxes and contradictory goals that may arise when a performance-based research funding system such as the BRI is introduced. A second set of lessons concerns university management, in particular the problems of applying systems like the Bibliometric Research Indicator to individuals. The chapter concludes with a short discussion of lessons for future research.


Keywords: Lessons for policymakers · University policymaking · Paradoxes of measurement · Seduction by numbers · Indicators in management · Implementation



Copyright information

© The Author(s) 2020

Authors and Affiliations

  1. Department of Political Science, Aarhus University, Aarhus, Denmark
  2. Department of Political Science and Public Management, University of Southern Denmark, Odense, Denmark
