Scientometrics

Volume 114, Issue 2, pp 719–734

Retractions covered by Retraction Watch in the 2013–2015 period: prevalence for the most productive countries

Article

Abstract

The research output of countries is among the indicators that help us understand the dynamics of science. Increasingly, these dynamics have been marked by changes in scientific communication. Researchers’ attitudes toward open science, alternative publication models, and originality are among the elements shaping the current scientific landscape. This changing panorama is reflected in the attitude of authors, editors, and publishers toward correction of the literature, a practice encountered to different extents in different fields. Among other things, this practice may indicate the scientific community’s commitment to strengthening the reliability of the research record. Is the research output of countries associated with this panorama? We analyzed 1623 retractions issued in 2013–2015 and discussed in Retraction Watch (RW), www.retractionwatch.com. These retractions account for a considerable fraction of all retraction notices in PubMed for the same period. They were categorized by reason, field, and country (that of the corresponding author). The retractions were distributed among 71 countries, with 15 countries accounting for a major share (85%); most of these are among the countries with the largest numbers of publications in the Scimago Journal & Country Rank (SJR). However, there is no consistent pattern in the relationship between a country’s SJR ranking and its ranking by number of retractions in our RW dataset, which is skewed mainly by the fact that the RW website tends to post newsworthy retractions, with a bias toward the biomedical and clinical sciences. This caveat notwithstanding, the prevalence of the most productive countries in our dataset of retractions is worth noting. Retractions have gradually been permeating the dynamics of research productivity in many countries, but knowledge of this interaction remains limited. We believe it should be further explored.
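
As a purely illustrative sketch (not the authors' method or data), the comparison described above between a country's productivity ranking and its ranking by number of retractions could be expressed as a Spearman rank correlation; the country labels, retraction counts, and SJR ranks below are invented for the example.

    # Illustrative only: hypothetical numbers, not the dataset analyzed in this paper.
    from scipy.stats import spearmanr

    # Hypothetical retraction counts per corresponding-author country, 2013-2015.
    retraction_counts = {"A": 400, "B": 300, "C": 120, "D": 110, "E": 60}

    # Hypothetical SJR productivity ranks for the same countries (1 = most publications).
    sjr_rank = {"A": 1, "B": 2, "C": 5, "D": 9, "E": 4}

    # Rank countries by number of retractions (1 = most retractions).
    by_retractions = sorted(retraction_counts, key=retraction_counts.get, reverse=True)
    retraction_rank = {country: i + 1 for i, country in enumerate(by_retractions)}

    # Spearman correlation between the two rankings.
    rho, p = spearmanr([sjr_rank[c] for c in by_retractions],
                       [retraction_rank[c] for c in by_retractions])
    print(f"Spearman rho = {rho:.2f}, p = {p:.2f}")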

Keywords

Retractions · Correction of the literature · Science indicators · Research output

Notes

Acknowledgements

We thank Ivan Oransky for useful information on particularities of the Retraction Watch website. Miguel Roig and Martha Sorenson are also acknowledged for their suggestions and critical reading of the manuscript. We also thank Alison Abritis, who provided valuable comments on earlier versions. The interpretation of the data is the sole responsibility of the authors. The first author acknowledges support from the Coordination for the Advancement of Higher Education Personnel (CAPES).

References

  1. Academy of Medical Sciences. (2015). Reproducibility and reliability of biomedical research: Improving research practice. Technical report.
  2. Aksnes, D. W., & Sivertsen, G. (2004). The effect of highly cited papers on national citation indicators. Scientometrics, 59(2), 213–224.
  3. Amos, K. A. (2014). The ethics of scholarly publishing: Exploring differences in plagiarism and duplicate publication across nations. Journal of the Medical Library Association, 102(2), 87–91.
  4. Anderson, M. S., Horn, A. S., Risbey, K. R., Ronning, E. A., De Vries, R., & Martinson, B. C. (2007a). What do mentoring and training in the responsible conduct of research have to do with scientists’ misbehavior? Findings from a national survey of NIH-funded scientists. Academic Medicine, 82(9), 853–860.
  5. Anderson, M. S., Ronning, E. A., De Vries, R., & Martinson, B. C. (2007b). The perverse effects of competition on scientists’ work and relationships. Science and Engineering Ethics, 13(4), 437–461.
  6. Azoulay, P., Bonatti, A., & Krieger, J. L. (2017). The career effects of scandal: Evidence from scientific retractions. Research Policy, 46(9), 1552–1569.
  7. Azoulay, P., Furman, J. L., Krieger, J. L., & Murray, F. E. (2012). Retractions. Review of Economics and Statistics, 97(5), 1118–1136.
  8. Barbour, V., Bloom, T., Lin, J., & Moylan, E. (2017). Amending published articles: Time to rethink retractions and corrections? bioRxiv. https://doi.org/10.1101/118356.
  9. Basu, A. (2006). Using ISI’s ‘Highly Cited Researchers’ to obtain a country level indicator of citation excellence. Scientometrics, 68(3), 361–375.
  10. Batty, M. (2003). The geography of scientific citation. Environment and Planning A, 35(5), 761–770.
  11. Bohannon, J. (2014). Study of massive preprint archive hints at the geography of plagiarism. Science. http://www.sciencemag.org/news/2014/12/study-massive-preprint-archive-hints-geography-plagiarism. Accessed in November 2017.
  12. Butler, N., Delaney, H., & Spoelstra, S. (2015). The grey zone: How questionable research practices are blurring the boundary between science and misconduct. Times Higher Education. https://www.timeshighereducation.com/blog/grey-zone-how-questionable-research-practices-are-blurring-boundary-between-science-and. Accessed in November 2017.
  13. Casadevall, A., & Fang, F. C. (2012). Reforming science: Methodological and cultural reforms. Infection and Immunity, 80(3), 891–896.
  14. Citron, D. T., & Ginsparg, P. (2014). Patterns of text reuse in a scientific corpus. Proceedings of the National Academy of Sciences, 112(1), 25–30.
  15. Cokol, M., Iossifov, I., Rodriguez-Esteban, R., & Rzhetsky, A. (2007). How many scientific papers should be retracted? EMBO Reports, 8(5), 422–423.
  16. Cokol, M., Ozbay, F., & Rodriguez-Esteban, R. (2008). Retraction rates are on the rise. EMBO Reports, 9(1), 2.
  17. Colquhoun, D. (2011). Publish or perish: Peer review and the corruption of science. The Guardian, London, United Kingdom. https://www.theguardian.com/science/2011/sep/05/publish-perish-peer-review-science. Accessed in April 2017.
  18. Corbyn, Z. (2009). Retractions up tenfold. Times Higher Education. www.timeshighereducation.co.uk/407838.article. Accessed in April 2017.
  19. Davis, M. S. (2003). The role of culture in research misconduct. Accountability in Research, 10(3), 189–201.
  20. Davis, M. S., Riske-Morris, M., & Diaz, S. R. (2007). Causal factors implicated in research misconduct: Evidence from ORI case files. Science and Engineering Ethics, 13(4), 395–414.
  21. DuBois, J. M., Anderson, E. E., Chibnall, J., Carroll, K., Gibb, T., Ogbuka, C., et al. (2013). Understanding research misconduct: A comparative analysis of 120 cases of professional wrongdoing. Accountability in Research, 20(5–6), 320–338.
  22. Durieux, V., & Gevenois, P. A. (2010). Bibliometric indicators: Quality measurements of scientific publications. Radiology, 255(2), 342–351.
  23. Durkheim, E. (1893). The division of labor in society. http://durkheim.uchicago.edu/Summaries/dl.html. Accessed in October 2017.
  24. Enserink, M. (2017). How to avoid the stigma of a retracted paper? Don’t call it a retraction. Science. http://www.sciencemag.org/news/2017/06/how-avoid-stigma-retracted-paper-dont-call-it-retraction. Accessed in November 2017.
  25. Fanelli, D. (2013). Why growing retractions are (mostly) a good sign. PLoS Medicine. https://doi.org/10.1371/journal.pmed.1001563.
  26. Fanelli, D., Costas, R., & Larivière, V. (2015). Misconduct policies, academic culture and career stage, not gender or pressures to publish, affect scientific integrity. PLoS ONE. https://doi.org/10.1371/journal.pone.0127556.
  27. Fang, F. C., & Casadevall, A. (2011). Retracted science and the retraction index. Infection and Immunity, 79(10), 3855–3859.
  28. Fang, F. C., Steen, R. G., & Casadevall, A. (2012). Misconduct accounts for the majority of retracted scientific publications. Proceedings of the National Academy of Sciences, 109(42), 17028–17033.
  29. Garfield, E. (1955). Citation indexes for science: A new dimension in documentation through association of ideas. Science, 122(3159), 108–111.
  30. Garfield, E., Malin, M. V., & Small, H. (1983). Essays of an Information Scientist (Vol. 6, p. 580) (Reprinted from Toward a metric of science: The advent of science indicators, by Y. Elkana, J. Lederberg, R. K. Merton, A. Thackray & H. Zuckerman, Eds., 1978, NY: John Wiley & Sons). http://garfield.library.upenn.edu/essays/v6p580y1983.pdf.
  31. Garfield, E. (1987). The anomie-deviant behavior connection: The theories of Durkheim, Merton, and Srole. Current Contents, 39. In Essays of an Information Scientist (Vol. 10, pp. 272–281). http://www.garfield.library.upenn.edu/essays/v10p272y1987.pdf.
  32. Garfield, E. (1989). Evaluating research: Do bibliometric indicators provide the best measures? http://www.garfield.library.upenn.edu/essays/v12p093y1989.pdf. Accessed in April 2017.
  33. Garfield, E. (1996). What is the primordial reference for the phrase ‘publish or perish’? The Scientist, 10(12), 11.
  34. Garfield, E. (1999). Journal impact factor: A brief review. Canadian Medical Association Journal, 161(8), 979–980.
  35. Gevers, M. (2014). Scientific performance indicators: A critical appraisal and a country-by-country analysis. http://www.portlandpresspublishing.com/sites/default/files/Editorial/Wenner/WG_87/WG_87_chapter%205.pdf. Accessed in April 2017.
  36. Giofrè, D., Cumming, G., Fresc, L., Boedker, I., & Tressoldi, P. (2017). The influence of journal submission guidelines on authors’ reporting of statistics and use of open research practices. PLoS ONE. https://doi.org/10.1371/journal.pone.0175583.
  37. Glänzel, W., Beck, R., Milzow, K., et al. (2016). Data collection and use in research funding and performing organisations: General outlines and first results of a project launched by Science Europe. Scientometrics, 106(2), 825–835.
  38. Graf, C., Wager, E., Bowman, A., Fiack, S., Scott-Lichter, D., & Robinson, A. (2007). Best practice guidelines on publication ethics: A publisher’s perspective. International Journal of Clinical Practice, 61(152), 1–26.
  39. Grieneisen, M. L., & Zhang, M. (2012). A comprehensive survey of retracted articles from the scholarly literature. PLoS ONE. https://doi.org/10.1371/journal.pone.0044118.
  40. He, T. (2013). Retraction of global scientific publications from 2001 to 2010. Scientometrics, 96(2), 555–561.
  41. Hesselmann, F., Graf, V., Schmidt, M., & Reinhart, M. (2017). The visibility of scientific misconduct: A review of the literature on retracted journal articles. Current Sociology Review, 65(6), 814–845.
  42. International Committee of Medical Journal Editors (ICMJE). (2017). Defining the role of authors and contributors. http://www.icmje.org/recommendations/browse/roles-and-responsibilities/defining-the-role-of-authors-and-contributors.html. Accessed in November 2017.
  43. John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532.
  44. Journal Citation Reports (JCR). https://jcr.incites.thomsonreuters.com/.
  45. King, J. (1987). A review of bibliometric and other science indicators and their role in research evaluation. Journal of Information Science, 13(5), 267–276.
  46. Lancet. (2015). Correcting the scientific literature: Retraction and republication. Lancet, 385(9966), 394.
  47. Lu, S. F., Jin, G. Z., Uzzi, B., & Jones, B. (2013). The retraction penalty: Evidence from the Web of Science. Scientific Reports, 3(3146), 1–5.
  48. Marcus, A., & Oransky, I. (2014). What studies about retractions tell us. Journal of Microbiology & Biology Education, 15(2), 151–154.
  49. Martinson, B. C., Anderson, M. S., & de Vries, R. (2005). Scientists behaving badly. Nature, 435(7043), 737–738.
  50. Meo, S. A., Masri, A. A. A., Usmani, A. M., Memon, A. N., & Zaidi, S. Z. (2013). Impact of GDP, spending on R&D, number of universities and scientific journals on research publications among Asian countries. PLoS ONE, 8(6), e66449.
  51. Merton, R. K. (1968). Social theory and social structure. http://garfield.library.upenn.edu/classics1980/A1980JS04600001.pdf. Accessed in October 2017.
  52. OECD. (1994). The measurement of scientific and technological activities. http://www.oecd-ilibrary.org/docserver/download/9294041e.pdf?expires=1508687835&id=id&accname=guest&checksum=4DD43855CEB9E9D188A379D6465DBC84. Accessed in September 2017.
  53. OSTP. (2000). Federal policy on research misconduct. Federal Register, 65, 76260–76264.
  54. PLOS ONE. (2017). Authorship. http://journals.plos.org/plosone/s/authorship. Accessed in November 2017.
  55. Qiu, J. (2010). Publish or perish in China. Nature, 463, 142–143.
  56. Resnik, D. B. (2009). Scientific research and the public trust. Science and Engineering Ethics, 17(3), 399–409.
  57. Resnik, D. B., Neal, T., Raymond, A., & Kissling, G. E. (2015). Research misconduct definitions adopted by U.S. research institutions. Accountability in Research, 22(1), 14–21.
  58. Retraction Watch. www.retractionwatch.com.
  59. Retraction Watch. James Hunton archive. http://retractionwatch.com/category/by-author/james-hunton/.
  60. Retraction Watch. Diederik Stapel archive. http://retractionwatch.com/category/by-author/diederik-stapel/.
  61. Roig, M. (2010). Plagiarism and self-plagiarism: What every author should know. Biochemia Medica, 20(3), 295–300.
  62. RPubs. (2017). PubMed retractions report. https://rpubs.com/neilfws/65778. Accessed in November 2017.
  63. Science Europe. (2015). Research integrity: What it means, why it is important and how we might protect it. D/2015/13.324/9. http://www.scienceeurope.org/wp-content/uploads/2015/12/Briefing_Paper_Research_Integrity_web.pdf. Accessed in December 2017.
  64. SCImago. SJR: Scimago Journal & Country Rank. http://www.scimagojr.com.
  65. Starovoytova, D. (2017). Plagiarism under a magnifying-glass. Journal of Education and Practice, 8(15), 109–129.
  66. Steen, R. G. (2011). Retractions in the scientific literature: Do authors deliberately commit research fraud? Journal of Medical Ethics, 37(2), 113–117.
  67. Steen, R. G., Casadevall, A., & Fang, F. (2013). Why has the number of scientific retractions increased? PLoS ONE. https://doi.org/10.1371/journal.pone.0068397.
  68. Tijdink, J. K., Verbeke, R., & Smulders, Y. M. (2014). Publication pressure and scientific misconduct in medical scientists. Journal of Empirical Research on Human Research Ethics, 9(5), 64–71.
  69. Van Noorden, R. (2011). Science publishing: The trouble with retractions. Nature, 478, 26–28.
  70. Vasconcelos, S., Leta, J., Costa, L., Pinto, A., & Sorenson, M. (2009). Discussing plagiarism in Latin American science. EMBO Reports, 10(7), 677–682.
  71. Wager, E., Barbour, V., Yentis, S., Kleinert, S., & on behalf of COPE Council. (2009). Retractions: Guidance from the Committee on Publication Ethics (COPE). Croatian Medical Journal, 50(6), 532–535.
  72. Zhang, M., & Grieneisen, M. L. (2013). The impact of misconduct on the published medical and non-medical literature, and the news media. Scientometrics, 96(2), 573–587.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2018

Authors and Affiliations

  1. Science Education Program, Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, Brazil