Do altmetrics work for assessing research quality?
Alternative metrics (altmetrics) are attracting growing interest in the scientometrics community because they can capture both the volume and the quality of the attention a research work receives online. Nevertheless, little is known about their effectiveness as a means of measuring research impact compared to traditional citation-based indicators. This work rigorously investigates whether correlations exist among indicators, both traditional (i.e. citation count and h-index) and alternative (i.e. altmetrics), and which of them may be effective for evaluating scholars. The study analyses real data from the National Scientific Qualification procedure, carried out in Italy by committees of peers on behalf of the Italian Ministry of Education, Universities and Research.
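The core of such a study is computing rank correlations between indicator values across scholars. As a minimal sketch of this kind of analysis, the snippet below computes Spearman's rho between a traditional indicator (citation counts) and an altmetric score; all numbers are illustrative toy data, not values from the National Scientific Qualification study.

```python
# Hypothetical sketch: Spearman rank correlation between a citation-based
# indicator and an altmetric score across a sample of scholars.
# The data below is invented for illustration only.

def ranks(values):
    """Return 1-based ranks of the values (tie-free toy data assumed)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(x, y):
    """Spearman's rho via the rank-difference formula (no-ties case)."""
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

citation_counts  = [12, 45, 3, 88, 27, 64, 9, 150]   # per-scholar citations
altmetric_scores = [5, 30, 2, 70, 10, 4, 55, 120]    # per-scholar altmetric score

print(round(spearman_rho(citation_counts, altmetric_scores), 3))  # → 0.619
```

A rho close to 1 would indicate that the alternative indicator ranks scholars much like the traditional one; values near 0 would suggest the two capture largely independent signals.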
Keywords: Altmetrics · Research quality · Bibliometric indicators · Correlation analysis
This research has been supported by the Italian National Agency for the Evaluation of the University and Research Systems (ANVUR) within the Measuring the Impact of Research - Alternative indicators (MIRA) project. Andrea Giovanni Nuzzolese is the main contributor of this paper and the principal investigator of the MIRA project.