
Scientometrics, Volume 121, Issue 1, pp 433–450

Universities through the eyes of bibliographic databases: a retroactive growth comparison of Google Scholar, Scopus and Web of Science

  • Enrique Orduna-Malea
  • Selenay Aytac
  • Clara Y. Tran

Abstract

The purpose of this study is to ascertain the suitability of Google Scholar's (GS) url-based method as a valid approximation of universities' academic output, taking into account three aspects: retroactive growth, correlation, and coverage. To do this, a set of 100 Turkish universities was selected as a case study. Their productivity in Web of Science (WoS), Scopus and GS (2000–2013) was captured in two measurement iterations (2014 and 2018). In addition, a total of 18,174 documents published by a subset of 14 research-focused universities were retrieved from WoS, and their presence in GS within the official university web domain was verified. Findings suggest that retroactive growth in GS is unpredictable and varies from one university to another, making this parameter hard to evaluate at the institutional level. The correlation of productivity between GS (url-based method) and WoS and Scopus (selected sources) is moderately positive, although it varies with the university, the year of publication, and the year of measurement. Finally, only 16% of the 18,174 articles analyzed were indexed on the official university website, although up to 84% were indexed in other GS sources. This work shows that the url-based method of calculating institutional productivity in GS is not a good proxy for the total number of publications indexed in WoS and Scopus, at least in the national context analyzed. However, the main reason is not the operation of GS itself but the universities' weak commitment to open access.
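The abstract describes the url-based method only at a high level. Below is a minimal sketch, assuming the method amounts to GS searches restricted to a university's web domain with the site: operator and bounded by publication year via GS's as_ylo/as_yhi URL parameters. The domain example.edu.tr is hypothetical, and GS offers no official API, so the sketch only constructs the query URLs whose per-year hit counts would then be recorded (automated scraping of GS is rate-limited and discouraged).

```python
# Sketch of the url-based productivity query (assumed interpretation):
# one Google Scholar query per university domain and publication year.
from urllib.parse import urlencode

GS_ENDPOINT = "https://scholar.google.com/scholar"

def gs_url_query(domain: str, year: int) -> str:
    """Build a GS query URL for documents hosted under `domain`
    and published in `year` (as_ylo/as_yhi bound the year filter)."""
    params = {
        "q": f"site:{domain}",  # restrict hits to the university domain
        "as_ylo": year,         # lower bound of the publication-year range
        "as_yhi": year,         # upper bound of the publication-year range
        "hl": "en",
    }
    return f"{GS_ENDPOINT}?{urlencode(params)}"

# Example: one query per year of the study window for a hypothetical domain.
for y in range(2000, 2014):
    print(gs_url_query("example.edu.tr", y))
```

Re-running the same queries in a later year (here, 2014 and 2018) and comparing the reported hit counts is what the study calls retroactive growth.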

Keywords

Universities · Google Scholar · Bibliometrics · Web of Science · Scopus · Academic search engines · Research productivity · Retroactive growth · Bibliographic databases · Turkey


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  1. Universitat Politècnica de València, Valencia, Spain
  2. Long Island University, Brooklyn, USA
  3. Stony Brook University, Stony Brook, USA
