
Research Excellence in the Era of Online Attention: Altmetrics of South Africa’s Highly Cited Papers in Selected Research Fields

  • Omwoyo Bosire Onyancha
Article

Abstract

The use of highly cited papers (HCPs) to assess research excellence (RE) is widespread, but the emergence of alternative metrics (altmetrics) for assessing research impact has ignited debate about their role in measuring RE. This paper presents an altmetrics analysis of South Africa’s HCPs in order to assess the extent of online attention these papers receive, the correlation between citation and altmetrics data, and differences between research fields, using both conventional and altmetrics data. The Web of Science (WoS) InCites Essential Science Indicators (ESI) database was used to extract the citation metrics of the HCPs, while the online portal Altmetric.com was used to obtain the altmetrics data for each HCP. Pearson correlation analysis was used to determine the relationship between the citation data and the altmetrics of the HCPs. The results indicate that Clinical Medicine produced the highest number of HCPs but trailed Space Science and Molecular Biology & Genetics in citations per paper. Only six fields recorded more than 100 HCPs. Overall, South Africa’s HCPs have the strongest altmetrics presence in Dimensions, followed by Mendeley, Twitter, Facebook, news outlets and blogs; however, each research field exhibited a different pattern of online attention to its papers. The paper concludes with implications for the plausibility of using altmetrics to assess RE in South Africa.
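The Pearson correlation analysis described above can be sketched as follows. This is a minimal illustration only, using hypothetical citation counts and altmetric scores (not the study's data), with the correlation coefficient computed from its textbook definition:

```python
# Illustrative sketch: Pearson correlation between citation counts and
# altmetric scores for a set of HYPOTHETICAL highly cited papers.
# The numbers below are invented for demonstration, not the study's data.
from statistics import mean, stdev

citations = [412, 980, 255, 1530, 610, 340]   # hypothetical WoS citation counts
altmetric = [35, 120, 12, 210, 88, 20]        # hypothetical Altmetric.com scores

def pearson_r(x, y):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    # Sample covariance divided by the product of sample standard deviations
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

r = pearson_r(citations, altmetric)
print(f"Pearson r = {r:.3f}")
```

In practice a library routine such as `scipy.stats.pearsonr` would also return a p-value for testing whether the observed correlation differs significantly from zero.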

Keywords

Research excellence · Research quality · Research impact · South Africa · Highly cited papers · Essential Science Indicators · Citations · Altmetrics


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Department of Information Science, University of South Africa (Unisa), Pretoria, South Africa
