
Abstract

In recent years the scientific world has been experiencing a revolution: the attention of the scientific community, and of society at large, is no longer focused solely on results but also on their reliability, safety and efficacy, as well as on the efficient and effective use of resources. A vigorous debate is now under way about the reproducibility of scientific results, prompted by the growing number of retractions of published articles, some due to outright fraud and others simply to a lack of controls or to poor practice. Starting from a couple of surveys carried out by Big Pharma on the reliability of the products of scientific research, scientific journals [1–4] and even the press [5–8] soon pointed out that attempts to reproduce results published in peer-reviewed journals too often fail, even when the results come from large, experienced laboratories. While some cases of fraud have been uncovered and have led to the retraction of the papers involved, in other cases the irreproducibility could be attributed to improper data management and processing or, more broadly, to a lack of good experimentation management. One example is reported in The Economist [5]: a 2010 study published in Science, the prestigious American journal, was retracted a year later after strong criticism from other geneticists, who objected that different statistical treatments had been applied to the samples taken from centenarians and to those from the younger control group. The authors justified the withdrawal by admitting technical errors and an inadequate quality-control protocol. The number of retractions is growing faster than ever, having increased tenfold over the past decade; it nevertheless represents no more than 0.2% of the 1,400,000 papers published annually in scientific journals [5]. These figures cannot account for the lack of reproducibility reported by Big Pharma, leaving the impression that the phenomenon deserves more attention and further investigation. Peer review has long been held to be the best way to judge the worth of scientific papers and to guarantee the validity of the results presented for publication. Experiments have been carried out by journalists and even by editors to test the real effectiveness of peer review (see Box 2.1 for some examples). Overall, they suggest that only a minority of reviewers carry out an in-depth analysis of the scientific results to identify possible errors; in some cases reviewers even appear to omit due checks, whether scientific or methodological. In any case, peer review cannot be taken as the final, effective control against inaccuracies in a research project, although much has been done, and is being done, by the most important scientific journals to ensure the quality of what they publish. Peer review has to be accompanied, indeed preceded, by a rigorous method for conducting the studies themselves.


Notes

  1. LifeWatch: http://lifewatch.eu/What_is_LifeWatch. Accessed 13 Sep 2017.

References

  1. Collins FS, Tabak LA. Policy: NIH plans to enhance reproducibility. Nature. 2014;505:612–3. https://doi.org/10.1038/505612a.

  2. Freedman LP, et al. The economics of reproducibility in preclinical research. PLoS Biol. 2015;13(6):e1002165. https://doi.org/10.1371/journal.pbio.1002165.

  3. Jasny BR, et al. Fostering reproducibility in industry-academia research: sharing can pose challenges for collaborations. Science. 2017;357(6353):759–61. https://doi.org/10.1126/science.aan4906.

  4. Ioannidis JPA. Why most published research findings are false. PLoS Med. 2005;2(8):e124.

  5. Trouble at the lab—scientists like to think of science as self-correcting. To an alarming degree, it is not. The Economist. 2013. http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble. Accessed 13 Sep 2017.

  6. How science goes wrong—scientific research has changed the world. Now it needs to change itself. The Economist. 2013. http://www.economist.com/news/leaders/21588069-scientific-research-has-changed-world-now-it-needs-change-itself-how-science-goes-wrong. Accessed 13 Sep 2017.

  7. Achenbach J. The new scientific revolution: reproducibility at last. The Washington Post 2015. https://www.washingtonpost.com/national/health-science/the-new-scientific-revolution-reproducibility-at-last/2015/01/27/ed5f2076-9546-11e4-927a-4fa2638cd1b0_story.html?utm_term=.61cd223ff312. Accessed 26 Sep 2017.

  8. Naik G. Scientists’ elusive goal: reproducing study results. The Wall Street Journal 2011. https://www.wsj.com/articles/SB10001424052970203764804577059841672541590. Accessed 26 Sep 2017.

  9. Eanes Z. NC State University group accused of falsifying research. dailytarheel.com. 2014. http://www.dailytarheel.com:8080/article/2014/02/nc-state-university-group-accused-of-falsifying-research. Accessed 13 Sep 2017.

  10. Ségalat L. System crash. EMBO Rep. 2010;11(2):86–9. https://doi.org/10.1038/embor.2009.278.

  11. Menegon A, et al. A new electro-optical approach for conductance measurement: an assay for the study of drugs acting on ligand-gated ion channels. Sci Rep. 2017. https://doi.org/10.1038/srep44843.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

Cite this chapter

Lanati, A. (2018). Introduction. In: Quality Management in Scientific Research. Springer, Cham. https://doi.org/10.1007/978-3-319-76750-5_1
