
Evolving regimes of multi-university research evaluation

Published in Higher Education

Abstract

Since 1980, national university departmental ranking exercises have developed in several countries. This paper reviews exercises in the U.S., U.K. and Australia to assess the state-of-the-art and to identify common themes and trends. The findings are that the exercises are becoming more elaborate, even unwieldy, and that there is some retreat from complexity. There seems to be a movement towards bibliometric measures. The exercises also seem to be effective in enhancing university focus on research strategy.


Notes

  1. Only research evaluation is considered here. The teaching and social or economic development missions of universities are not discussed.

  2. The National Research Council (NRC) functions under the auspices of the National Academy of Sciences (NAS), the National Academy of Engineering (NAE), and the Institute of Medicine (IOM). The NAS, NAE, IOM, and NRC are part of a private, nonprofit institution that provides science, technology and health policy advice under a congressional charter.

  3. Note the difference from the NRC exercise: the NRC issues a rank ordering of departments, whereas the RAE issues “grades” to each department.

  4. The portion of research funding based on the evaluation results was called the “research quantum” until 2001 and the “institutional grants scheme” thereafter. In 2004, the Institutional Grants Scheme accounted for AU $285 million of AU $4,283 million in R&D funding in universities (HERD) (Australian Vice Chancellors Committee 2005, Table A. 1; Australian Bureau of Statistics 2006, p. 3).

  5. Leading bibliometric researchers included, in the U.K., Ben Martin and John Irvine at SPRU, University of Sussex; in Australia, Paul Bourke and Linda Butler at the Australian National University in Canberra; and, in the U.S., Francis Narin of CHI Research, as well as ISI (now Thomson Scientific), provider of the Science Citation Index, the database most used for bibliometric analysis.

References

  • Australian Bureau of Statistics. (2006). Research and experimental development: higher education organisations, 2004 Reissue, 8111.0, July, http://www.ausstats.abs.gov.au/ausstats/.

  • Australian Government, Department of Education, Science and Training. (2006). Research quality framework: Assessing the quality and impact of research in Australia, The Recommended RQF. Commonwealth of Australia: October.

  • Australian Vice Chancellors Committee (AVCC). (2005). University Funding and Expenditure, January, http://www.universitiesaustralia.edu.au/documents/publications/stats/Funding&Expenditure.pdf.

  • Butler, L. (2003). Explaining Australia’s increased share of ISI publications—the effects of a funding formula based on publication counts. Research Policy, 32, 143–155. doi:10.1016/S0048-7333(02)00007-0.

  • Carr, K. (2008). New era for research quality: Announcement of Excellence in Research for Australia Initiative. Press release, February 26, 2008. Retrieved March 6, 2008 from: http://minister.industry.gov.au/SenatortheHonKimCarr/Pages/NEWERAFORRESEARCHQUALITY.aspx.

  • Department of Trade and Industry (DTI), Office of Science and Innovation. (2007). PSA Target Metrics 2006. London: HMSO.

  • Dusansky, R., & Vernon, C. J. (1998). Rankings of U.S. Economics Departments. The Journal of Economic Perspectives, 12(1), 157–170.

  • Geuna, A., & Martin, B. R. (2003). University research evaluation and funding: An international comparison. Minerva, 41, 277–304. doi:10.1023/B:MINE.0000005155.70870.bd.

  • Gläser, J., Laudel, G., Hinze, S., & Butler, L. (2002). Impact of evaluation-based funding on the production of scientific knowledge: What to worry about and how to find out. Expertise for the German Ministry for Education and Research, May, http://repp.anu.edu.au/expertise-glae-lau-hin-but.pdf.

  • Gläser, J., Spurling, T. S., & Butler, L. (2004). Intraorganisational evaluation: Are there “least evaluable units”? Research Evaluation, 13, 1. doi:10.3152/147154404781776554.

  • Harman, G. (2000). Allocating research infrastructure grants in post-binary higher education systems: British and Australian approaches. Journal of Higher Education Policy and Management, 22(2), 111–126. doi:10.1080/14636770307132.

  • Higher Education Research Opportunities in the United Kingdom (HERO). (2008). http://www.hero.ac.uk/uk/research/research_assessment_exercise_.cfm. Accessed 12 June 2007.

  • Jackman, R. W., & Siverson, R. M. (1996). Rating the ratings: An analysis of the National Research Council’s rating of political science PhD Programs. PS: Political Science & Politics, 29(2), 155–160. doi:10.2307/420693.

  • Lipsett, A. (2005). RAE raises UK journal activity. Times Higher Education Supplement, July 1, section 1698, 4.

  • Lipsett, A. (2007). RAE selection gets brutal. Times Higher Education Supplement, February 2, section 1779, 1.

  • Marginson, S. (1997). Steering from a distance: Power relations in Australian higher education. Higher Education, 34(1), 63–80. doi:10.1023/A:1003082922199.

  • Martin, B. R. (2007, January). Replacing the RAE with Metrics—Is this the way forward? Seminar presented at Imperial College.

  • Miller, A. H., Tien, C., & Peebler, A. A. (1996). Department rankings: An alternative approach. PS: Political Science & Politics, 29(4), 704–717. doi:10.2307/420798.

  • Moed, H. F. (2007). UK Research assessment exercises: Informed judgments on research quality or quantity? Scientometrics, Forthcoming.

  • National Research Council. (2004). Assessing research doctorate programs: A methodology study. Washington DC: National Academies Press. http://www.nap.edu/openbook/030909058X/html/28.html.

  • Sanders, C. (2006). Boom time for high-flyers. Times Higher Education Supplement, March 17, section 1734, 6.

  • Sastry, T., & Bekhradnia, B. (2006). Using metrics to allocate research funds: A short evaluation of alternatives to the Research Assessment Exercise. Oxford: Higher Education Policy Institute.

  • Segal, Quince Wickstead (SQW). (1996). A study of selectivity. Bristol: HEFCE, M 20/96. http://www.hefce.ac.uk/pubs/hefce/1996/m20_96_2.htm.

  • U.K. Department of Trade and Industry (DTI). (2007). Science, engineering and technology statistics, Table 5.1.

  • Higher Education Funding Council for England (HEFCE). (1997). The impact of the 1992 Research Assessment Exercise on higher education institutions in England. Bristol: Higher Education Funding Council for England, M6/97. http://www.hefce.ac.uk/pubs/hefce/1997/m6_97.htm.

  • Wojtas, O. (2007). RAE fuels trend for research only time. Times Higher Education Supplement, January 19, section 1777, 64.

Acknowledgements

Consultations with Ben R. Martin, Linda Butler and Phil Shapira were most helpful in the writing of this paper, and the author is grateful for their insights. Any misinterpretations that remain are, however, solely the author's responsibility.

Author information

Correspondence to Diana Hicks.

Cite this article

Hicks, D. Evolving regimes of multi-university research evaluation. High Educ 57, 393–404 (2009). https://doi.org/10.1007/s10734-008-9154-0
