Scientometrics, Volume 115, Issue 3, pp 1463–1484

Has hosting on ScienceDirect improved the visibility of Latin American scholarly journals? A preliminary analysis of data quality

  • Shirley Ainsworth
  • Jane M. Russell
Article

Abstract

Latin American regional journals have adopted measures to increase both their quality and visibility, among them promoting their inclusion in international databases such as Scopus and Web of Science. An increasing number have recently taken advantage of the support and hosting services offered by Elsevier on the ScienceDirect (SD) platform in the expectation of achieving this goal. The present study takes a preliminary look at coverage trends in these databases for a sample of open access Latin American journals hosted on SD in June 2016, and at their metadata, in order to typify the most common errors that can affect the use of performance indicators in individual and institutional evaluations and the integration of authors into scholarly reputation systems. A significant level of such errors was found in journal coverage, duplicate records, reference data and links to full text, as well as in the conventions applied to author names and titles, which adversely affect the retrieval of articles and the correct assignment of author credit and citations. These findings suggest that present strategies have yet to deliver the expected results.
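Among the error types the abstract names, author-name conventions are the most directly illustrable. The following Python sketch is a minimal, hypothetical illustration (the names, records and the normalize helper are invented, not drawn from the paper) of how accent, hyphenation and surname-order variants of one Hispanic compound surname can split a single author's output across several indexed identities:

```python
# Hypothetical sketch: grouping database name variants of one author.
import unicodedata
from collections import defaultdict

def normalize(name: str) -> str:
    """Strip accents, lower-case, and replace hyphens and punctuation with
    spaces so that common variants of the same surname collide on one key."""
    ascii_name = (
        unicodedata.normalize("NFKD", name)
        .encode("ascii", "ignore")
        .decode("ascii")
    )
    for ch in "-.,":
        ascii_name = ascii_name.replace(ch, " ")
    return " ".join(ascii_name.lower().split())

# Invented variants of a single author as they might appear in different
# records: accented vs. plain, hyphenated vs. split surnames, and the
# second surname mistaken for the family name.
records = [
    "García-López, A.",
    "Garcia Lopez, A.",
    "Garcia-Lopez, A.",
    "López, A. García",
]

groups = defaultdict(list)
for rec in records:
    groups[normalize(rec)].append(rec)

for key, variants in sorted(groups.items()):
    print(f"{key!r}: {len(variants)} record(s) -> {variants}")
```

Note that even this naive normalization reconciles the accent and hyphenation variants but not the reordered surname: the last record still forms a separate identity, mirroring the credit-assignment problem the article describes.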

Keywords

Open access journals · Database errors · Journal visibility · Scopus · Web of Science · SciELO Citation Index

Notes

Acknowledgements

The authors would like to thank two anonymous reviewers whose comments and suggestions improved the manuscript. Preliminary results of this paper were presented at the International Seminar of Quantitative and Qualitative Studies of Science and Technologies “Prof. Gilberto Sotolongo Aguilar” 2016, Cuba.


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2018

Authors and Affiliations

  1. Instituto de Biotecnología, Universidad Nacional Autónoma de México, Cuernavaca, Mexico
  2. Instituto de Energías Renovables, Universidad Nacional Autónoma de México, Temixco, Mexico