A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar
Given the current availability of different bibliometric indicators and of publication and citation data sources, two questions immediately arise: do indicator scores differ when computed on different data sources? More importantly, do indicator-based rankings change significantly when computed on different data sources? We provide a case study of computer science scholars and journals evaluated on the Web of Science and Google Scholar databases. The study concludes that Google Scholar yields significantly higher indicator scores than Web of Science. Nevertheless, citation-based rankings of both scholars and journals do not change significantly across the two data sources, while rankings based on the h index show a moderate degree of variation.
Keywords: Bibliometric indicators · h index · Publication and citation data sources · Correlation analysis
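The two quantities the abstract compares can be illustrated with a short sketch: the h index of a scholar (the largest h such that h of their papers have at least h citations each, as defined by Hirsch) and a rank correlation between two score lists, one per data source. This is a minimal illustration with hypothetical citation counts, not the study's actual data or methodology; the Spearman formula below assumes untied scores.

```python
def h_index(citations):
    """h index: largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def spearman_rho(x, y):
    """Simplified Spearman rank correlation between two score lists,
    assuming no tied scores within either list."""
    n = len(x)
    rank = lambda v: {s: r for r, s in enumerate(sorted(v, reverse=True), start=1)}
    rx, ry = rank(x), rank(y)
    d2 = sum((rx[a] - ry[b]) ** 2 for a, b in zip(x, y))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical per-paper citation counts for one scholar on the two sources:
wos_cites = [12, 9, 7, 5, 3, 1]
gs_cites = [25, 18, 11, 9, 6, 2]
print(h_index(wos_cites))  # → 4
print(h_index(gs_cites))   # → 5

# Hypothetical scholar-level scores: higher on one source, identical ranking.
print(spearman_rho([10, 8, 6, 4], [20, 15, 14, 5]))  # → 1.0
```

The example mirrors the abstract's finding in miniature: the second source inflates every raw count (and hence the h index), yet the induced ranking of scholars can remain unchanged, which is what the rank correlation measures.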