
Scientometrics, Volume 98, Issue 2, pp 777–795

Variability of research performance across disciplines within universities in non-competitive higher education systems

  • Giovanni Abramo
  • Ciriaco Andrea D’Angelo
  • Flavia Di Costa

Abstract

Many nations are adopting higher education strategies that emphasize the development of elite universities able to compete at the international level in attracting skills and resources. Elite universities pursue excellence in all their disciplines and fields of action. The impression is that this does not occur in “non-competitive” higher education systems, where instead excellent and mediocre disciplines coexist within single universities. To test this, the authors measure research productivity in the hard sciences for all Italian universities over the period 2004–2008 at the level of the institution, of its individual disciplines, and of the fields within them. The results show that excellent disciplines are not concentrated in a few universities: top universities often contain mediocre disciplines and fields, while generally mediocre universities often include top disciplines.
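The kind of within-university comparison the abstract describes can be illustrated with a minimal sketch. The data, column names, and the simple output-per-staff ratio below are hypothetical assumptions for illustration only, not the authors' bibliometric methodology; the point is merely to show how per-discipline productivity can be ranked nationally and the spread of ranks inspected within each university.

import pandas as pd

# Hypothetical records: one row per (university, discipline), with field-normalized
# citation output and research staff counts. All figures are invented for illustration.
pubs = pd.DataFrame({
    "university": ["U1", "U1", "U2", "U2"],
    "discipline": ["Physics", "Chemistry", "Physics", "Chemistry"],
    "normalized_citations": [120.0, 45.0, 60.0, 150.0],
    "research_staff": [40, 30, 35, 25],
})

# Crude productivity proxy: field-normalized output per member of research staff.
pubs["productivity"] = pubs["normalized_citations"] / pubs["research_staff"]

# Rank each discipline against the same discipline at other universities.
pubs["national_rank"] = pubs.groupby("discipline")["productivity"].rank(ascending=False)

# A wide min-max spread of ranks within one university indicates that excellent and
# mediocre disciplines coexist in the same institution.
spread = pubs.groupby("university")["national_rank"].agg(["min", "max"])
print(spread)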

Keywords

Universities, Competition, Research productivity, Research evaluation, Bibliometrics, Italy


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2013

Authors and Affiliations

  • Giovanni Abramo (1)
  • Ciriaco Andrea D’Angelo (1, 2)
  • Flavia Di Costa (1, 3)

  1. Laboratory for Studies of Research and Technology Transfer, Institute for System Analysis and Computer Science (IASI-CNR), National Research Council of Italy, Rome, Italy
  2. Department of Engineering and Management, University of Rome “Tor Vergata”, Rome, Italy
  3. Research Value S.r.l., Rome, Italy
