Volume 115, Issue 2, pp 1107–1113

Multiple versions of the h-index: cautionary use for formal academic purposes

  • Jaime A. Teixeira da Silva
  • Judit Dobránszki


The h-index, or Hirsch index, named after Jorge E. Hirsch, is one of the few author-based metrics currently available that offers a perspective on the productivity and citation impact of a scientist, researcher, or academic. Four tools are most commonly used to calculate the h-index, each drawing on a separate database: Scopus, Web of Science, Google Scholar, and ResearchGate. Using the h-indices of the authors of this paper derived from these four sources, it is abundantly clear that the scores vary widely and that it is unclear which, if any, of these sources is reliable or accurate, for any purpose. As the use and application of author-based metrics increases, including for official academic purposes, it is becoming increasingly important to know which source of the h-index is most accurate, and thus valid. Although this is not a review of the h-index, some perspectives on the h-index-related literature are provided to place this case study within the wider context of the weaknesses of, and criticisms against, using the h-index as a metric for evaluating scientific output.
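For context, Hirsch's definition is straightforward: a researcher has index h if h of their papers each have at least h citations. The divergence the abstract describes arises because each database indexes a different set of citing documents for the same publication record. A minimal sketch of the calculation (the function name and sample citation counts are illustrative, not taken from the paper):

```python
def h_index(citations):
    """Return the largest h such that the author has h papers
    with at least h citations each (Hirsch's definition)."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still satisfies the threshold
        else:
            break  # ranked in descending order, so no later paper can
    return h

# The same five papers can yield different h-indices when databases
# index different numbers of citations for them:
print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([12, 9, 7, 6, 5]))  # 5: extra indexed citations raise h
```

Because h depends only on whether each paper clears the rank threshold, even small differences in database coverage near the threshold can shift the score.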


Keywords: Accuracy · Author-based metrics · Creditability · Databases · Google Scholar · Scopus · Web of Science




Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2018

Authors and Affiliations

  1. Miki-cho, Takamatsu, Japan
  2. Research Institute of Nyíregyháza, IAREF, University of Debrecen, Nyíregyháza, Hungary
