
Field Structuration and Fragmentation of Global Rankings

Rankings and Global Knowledge Governance

Part of the book series: Palgrave Studies in Global Higher Education (PSGHE)


Abstract

In this chapter, we look closely at the fragmentation of rankings and indicators relevant to knowledge governance in higher education, economic competitiveness, innovation, and good governance, a process that has challenged the established producers of numeric knowledge. Not only has the number of international datasets multiplied, but the variety of measurements, in terms of conceptual and methodological choices, has also grown. We find that this fragmentation has not effectively challenged the ideas behind the figures. Instead, the emerging indicator sets are woven into the fabric of existing measurements: the figures that enter the field largely build on established ones without fundamentally challenging their ideational premises. This further embeds numerical assessment in transnational governance.


Notes

  1.

    The World Bank has used the Country Policy and Institutional Assessment (CPIA) tool since the mid-1970s to assess eligibility for funding.

  2.

    The use of good governance indicators has also drawn interest, most notably with regard to development funding (Hammergren 2011; Knoll and Zloczysti 2012; Saisana and Saltelli 2011). Indicators such as the Worldwide Governance Indicators (WGI) are seen as instrumental for development aid, and they have also attracted attention at the local level (Morgan 2011; Stubbs 2009). While the World Bank has not used the WGI in its own allocation of funding, the index has acquired such uses elsewhere. The most prominent user of governance indices in development funding has been the US government through its Millennium Challenge Corporation (MCC), established in 2004.

  3.

    Most notably, the Millennium Challenge Corporation has used Fringe Special (and Open Net Initiative) data in its financing criteria since 2012, having previously used the WGI Voice and Accountability data.

  4.

    The indices explicitly criticized were the World Bank Governance Indicators, the European Central Bank’s Public Sector Efficiency Study, the World Economic Forum’s Public Institutions Index in the Global Competitiveness Report, and the “Government Efficiency” Indicator developed by the International Institute for Management Development in the World Competitiveness Yearbook.

  5.

    “There is a significant growth in broad measures of “governance”, including some comparative data concerning public sector bureaucratic quality. However, most of these data are based on subjective assessments, and were not initially collected with comparative analysis of public management as a principal aim. […] Reviews of these data note that these indicators incorporate significant methodological problems. The data often do not adequately measure what they claim to measure, and can aggregate many diverse indicators, achieving statistical quality at the price of significant loss of conceptual precision. Often data amount to broad subjective evaluations combined with service-specific performance indicators. The former can be excessively impressionistic and the latter cannot be aggregated in any meaningful way” (OECD 2005, 6).

  6.

    “The absence of a well-accepted theoretical framework for governance ensures that any composite indicators are largely devices for communication—for crystallizing concerns about corruption etc. into a single short and pithy summary” (OECD 2006c, 60).

  7.

    “More generally, recognizing the importance of margins of error and the imprecision of country rankings, we do not follow the popular practice of producing precisely ranked ‘top ten’ or ‘bottom ten’ lists of countries according to their performance on the WGI, recognizing that such seemingly precise ‘horse races’ are of dubious relevance and reliability” (Kaufmann et al. 2008, 5).

  8.

    “In 2003 after the publication of the Shanghai Jiaotong University breakthrough ranking, the Academic Ranking of World Universities (ARWU), we decided to adopt the main innovations proposed by Liu and his team. The ranking will be built from publicly available web data, combining the variables into a composite indicator, and with a true global coverage. The first edition was published in 2004, it appears twice per year since 2006 and after 2008 the portal also includes webometrics rankings for research centers, hospitals, repositories and business schools.” Webometrics Methodology, [http://www.webometrics.info/en/Methodology].

  9.

    Ranking Web of Universities, January 2017 New Edition, [http://www.webometrics.info/en/node/178], date accessed 30 June 2017.

  10.

    http://www.educationalpolicy.org/pdf/global2005.pdf, date accessed 28 February 2013.

  11.

    CWTS Leiden Ranking 2017 Indicators, [http://www.leidenranking.com/information/indicators], date accessed 30 June 2017.

  12.

    SIR Methodology, http://www.scimagoir.com/methodology.php, date accessed 30 June 2017.

  13.

    Transparent Ranking: Top Universities by Google Scholar Citations [http://www.webometrics.info/en/node/169], date accessed 30 June 2017.

  14.

    https://www.topuniversities.com/qs-world-university-rankings/methodology, date accessed 30 June 2017.

  15.

    https://www.timeshighereducation.com/world-university-rankings/methodology-world-university-rankings-2016-2017, date accessed 30 June 2017.

  16.

    Webometrics Methodology, [http://www.webometrics.info/en/Methodology].

  17.

    THE World University Rankings 2016–2017 methodology [https://www.timeshighereducation.com/world-university-rankings/methodology-world-university-rankings-2016-2017].

  18.

    SCImago Institutions Ranking methodology, http://www.scimagoir.com/methodology.php.

  19.

    Centrum für Hochschulentwicklung, Berlin Principles on Ranking of Higher Education Institutions [https://www.che.de/downloads/Berlin_Principles_IREG_534.pdf].

  20.

    Testing Student and University Performance Globally: OECD’s AHELO—OECD, [http://www.oecd.org/edu/skills-beyond-school/testingstudentanduniversityperformancegloballyoecdsahelo.htm].

  21.

    http://www.oecd.org/.

  22.

    http://www.u-multirank.eu/project/.

  23.

    The CHERPA consortium consisted of five partners: Centrum für Hochschulentwicklung (CHE, Germany), the Center for Higher Education Policy Studies at the University of Twente (Netherlands), the Centre for Science and Technology Studies (CWTS) at Leiden University (Netherlands), the INCENTIM research division at the Catholic University of Leuven (Belgium), and the Observatoire des Sciences et des Techniques in Paris.

  24.

    “U-Multirank—Education and Training—European Commission”. Education and Training. [https://ec.europa.eu/education/initiatives/u-multirank_en].

  25.

    The WEF report names a few concrete examples of perceived problems, such as methods for calculating physical sales of goods and services that do not take into account virtual platforms and nonmonetary exchanges of services, as well as measurement issues with GDP as an indicator of economic progress (World Economic Forum 2016, 51–52).

  26.

    https://www.globalinnovationindex.org/userfiles/file/reportpdf/gii-full-report-2015-v6.pdf

  27.

    https://www.bcgperspectives.com/content/articles/innovation_manufacturing_innovation_imperative_in_manufacturing/?chapter=3

  28.

    http://www.themanufacturinginstitute.org/~/media/6731673D21A64259B081AC8E083AE091.ashx

  29.

    http://www.bloomberg.com/graphics/2015-innovative-countries/, http://www.bloomberg.com/news/articles/2016-01-19/these-are-the-world-s-most-innovative-economies, https://www.bloomberg.com/news/articles/2017-01-17/sweden-gains-south-korea-reigns-as-world-s-most-innovative-economies

  30.

    http://www.cleantech.com/wp-content/uploads/2014/08/Global_Cleantech_Innov_Index_2014.pdf

  31.

    http://www2.itif.org/2016-contributors-and-detractors.pdf, http://www2.itif.org/2016-contributors-and-detractors-executive-summary.pdf

  32.

    The seven factors are R&D intensity, manufacturing value added, productivity, high-tech density, tertiary efficiency, researcher concentration, and patent activity.

  33.

    http://www.cleantech.com/wp-content/uploads/2014/08/Global_Cleantech_Innov_Index_2014.pdf, page 3.

  34.

    http://www.cleantech.com/wp-content/uploads/2014/08/Global_Cleantech_Innov_Index_2014.pdf, page 10.

  35.

    Haas further identifies a joint policy enterprise as a criterion for an epistemic community (Haas 1992). This might apply in the field of good governance, where the actors are often explicitly committed to governance reform. In the domain of university rankings, however, the motivations for creating the figures are less clear.


Copyright information

© 2018 The Author(s)

About this chapter


Cite this chapter

Erkkilä, T., Piironen, O. (2018). Field Structuration and Fragmentation of Global Rankings. In: Rankings and Global Knowledge Governance. Palgrave Studies in Global Higher Education. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-68941-8_5


  • DOI: https://doi.org/10.1007/978-3-319-68941-8_5


  • Publisher Name: Palgrave Macmillan, Cham

  • Print ISBN: 978-3-319-68940-1

  • Online ISBN: 978-3-319-68941-8

  • eBook Packages: Education (R0)
