Strategic Decisions About Research and Publications

  • Poul Erik Mouritzen
  • Niels Opstrup
Part of the Public Sector Organizations book series (PSO)

Abstract

This chapter focuses on how the research strategies of Danish university researchers have changed since the introduction of the Bibliometric Research Indicator (BRI) and on the effect that its local implementation has on scholars’ research and publication decisions. The use of different ‘counterstrategies’ is also studied, including slicing strategies, so-called salami publication, and how such strategies have developed since the BRI was introduced. Overall, the introduction of the BRI does not seem to have resulted in major changes. Local implementation, however, seems to matter. The more strongly the BRI is implemented at the department level, the more weight the department’s researchers place on quantity relative to quality. Similarly, stronger implementation is associated with less weight on long-term research processes relative to those that can lead to quick publications.

Keywords

Unintended consequences · Research strategies · Dysfunctional behaviour · Goal displacement · Slicing

Copyright information

© The Author(s) 2020

Authors and Affiliations

  1. Department of Political Science, Aarhus University, Aarhus, Denmark
  2. Department of Political Science and Public Management, University of Southern Denmark, Odense, Denmark
