The Power of Research Quality Assessments in Shaping Research Agendas

  • Carolyn Kagan
  • John Diamond
Chapter
Part of the Rethinking University-Community Policy Connections book series (REUNCOPOCO)

Abstract

This chapter reviews how community engagement has been affected by research policy, which has been dominated by questions of how to allocate funding across different kinds of research in the context of the expansion of higher education. Funding has been tied to assessments of quality, and the frameworks for assessing research quality have evolved over the years, distorting research activity in a number of ways. In parallel, policy has both constructed and reflected political and public concerns about value for money and the impact of research. The role played by academic publishers in advancing a metrics approach to the management of research is also reviewed.

Keywords

Research funding · Research selectivity · RAE · REF · Capability · Academic publishers · Impact

Copyright information

© The Author(s) 2019

Authors and Affiliations

  1. Department of Psychology, Manchester Metropolitan University, Manchester, UK
  2. Edge Hill University, Ormskirk, UK