Higher Education, Volume 57, Issue 4, pp 393–404

Evolving regimes of multi-university research evaluation

  • Diana Hicks


Since 1980, national exercises ranking university departments have developed in several countries. This paper reviews exercises in the U.S., U.K. and Australia to assess the state of the art and to identify common themes and trends. The findings are that the exercises have become more elaborate, even unwieldy, and that there is now some retreat from that complexity, with a movement towards bibliometric measures. The exercises also appear effective in sharpening universities' focus on research strategy.


Keywords: Composite index · ERA · NRC ranking · RAE · Ranking · Research · RQF · University



Acknowledgements

Consultations with Ben R. Martin, Linda Butler and Phil Shapira were most helpful in the writing of this paper, and the author is grateful for their insights. Any misinterpretations that remain are solely the author's responsibility.



Copyright information

© Springer Science+Business Media B.V. 2008

Authors and Affiliations

  1. School of Public Policy, Georgia Institute of Technology, Atlanta, USA
