Volume 118, Issue 2, pp 539–562

Do altmetrics work for assessing research quality?

  • Andrea Giovanni Nuzzolese
  • Paolo Ciancarini
  • Aldo Gangemi
  • Silvio Peroni
  • Francesco Poggi
  • Valentina Presutti


Alternative metrics (aka altmetrics) are gaining increasing interest in the scientometrics community, as they can capture both the volume and the quality of attention that a research work receives online. Nevertheless, little is known about their effectiveness as a means of measuring research impact compared to traditional citation-based indicators. This work rigorously investigates whether any correlation exists between traditional indicators (i.e. citation count and h-index) and alternative ones (i.e. altmetrics), and which of them may be effective for evaluating scholars. The study is based on the analysis of real data from the National Scientific Qualification procedure held in Italy by committees of peers on behalf of the Italian Ministry of Education, Universities and Research.
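The two ingredients of the analysis described above can be sketched in a few lines: computing a scholar's h-index from per-paper citation counts, and measuring the rank correlation (Spearman's rho) between two indicator series. This is an illustrative sketch only; the function names and all data below are hypothetical and not taken from the paper.

```python
def h_index(citations):
    """Largest h such that the scholar has h papers with >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def spearman_rho(xs, ys):
    """Spearman rank correlation: Pearson correlation computed on ranks.

    No tie correction, for simplicity; assumes len(xs) == len(ys) >= 2.
    """
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0.0] * len(vs)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

if __name__ == "__main__":
    # Hypothetical per-paper citation counts for one scholar.
    print(h_index([10, 8, 5, 4, 3]))  # -> 4

    # Hypothetical per-scholar citation counts vs. altmetric scores.
    citation_counts = [120, 80, 45, 30, 12, 5]
    altmetric_scores = [300, 150, 90, 80, 20, 10]
    print(round(spearman_rho(citation_counts, altmetric_scores), 3))  # -> 1.0
```

A rank-based coefficient is the natural choice here because both citation counts and altmetric scores are heavily skewed, so a linear (Pearson) correlation would be dominated by a few highly cited or highly tweeted papers.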


Keywords: Altmetrics · Research quality · Bibliometric indicators · Correlation analysis



This research has been supported by the Italian National Agency for the Evaluation of the University and Research Systems (ANVUR) within the Measuring the Impact of Research - Alternative indicators (MIRA) project. Andrea Giovanni Nuzzolese is the main contributor of this paper and the principal investigator of the project that supported this research.



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  1. STLab, ISTC-CNR, Rome, Italy
  2. DISI, University of Bologna, Bologna, Italy
  3. FICLIT, University of Bologna, Bologna, Italy
