The Impact of Errors in the Scopus Database on the Research Assessment

  • I. V. Selivanova
  • D. V. Kosyakov
  • A. E. Guskov

Abstract

This paper presents the results of an analysis of the causes of duplicate profiles in the Scopus database, based on a random sample of the profiles of 400 Russian authors and 400 organizations. We estimate the number of duplicate profiles and calculate the level of uncertainty that errors in bibliographic descriptions can introduce into the results of scientometric studies that rely on Scopus. The analysis showed that 76% of the organizations and 24% of the authors in Scopus have duplicate profiles; as a result, organizations lose an average of 17% of their publications and authors lose 11%. The results of this study can be used to refine the Scopus database and to estimate the error level in the research assessment of institutions and individuals.
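
The shares above are estimated from random samples of 400 author and 400 organization profiles. As a rough illustration of the sampling uncertainty that such estimates carry (a sketch under the assumption of simple random sampling, not the authors' own procedure), the following Python snippet computes normal-approximation 95% confidence intervals for the reported duplicate-profile shares:

    import math

    def proportion_ci(p_hat, n, z=1.96):
        """Normal-approximation (Wald) confidence interval for a sample
        proportion; 95% for the default z = 1.96."""
        margin = z * math.sqrt(p_hat * (1.0 - p_hat) / n)
        return max(0.0, p_hat - margin), min(1.0, p_hat + margin)

    # Duplicate-profile shares reported in the abstract, each estimated
    # from a random sample of 400 Scopus profiles.
    for label, share in [("organizations", 0.76), ("authors", 0.24)]:
        low, high = proportion_ci(share, n=400)
        print(f"{label}: {share:.0%} duplicates, 95% CI about {low:.1%}-{high:.1%}")

Under these assumptions, each reported share is accurate to within roughly four percentage points.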

Keywords:

bibliographic databases, Scopus, identification, scientometrics, bibliometrics, bibliographic errors, ORCID

Notes

FUNDING

This work was carried out as part of research project no. 0334-2019-006, with the support of the Russian Foundation for Basic Research, grant no. 18-011-00797.

CONFLICT OF INTEREST

The authors declare that they have no conflict of interest.

Copyright information

© Allerton Press, Inc. 2019

Authors and Affiliations

  1. State Public Scientific and Technological Library, Siberian Branch, Russian Academy of Sciences, Novosibirsk, Russia
