2.1 Funders/Granting Agencies/Policy Makers
Many funders and policy makers have acknowledged the issue of irreproducibility and are developing new guidelines and initiatives to support the generation of data that are robust and reproducible. This section highlights guidelines, policies, and resources directly related to this issue in preclinical research by the major international granting institutions and is not intended to be an exhaustive review of all available guidelines, policies, and resources. Instead, the organizations reviewed represent a cross-section of many of the top funding organizations and publishers in granting volume and visibility. Also included is a network focused specifically on robustness, reproducibility, translatability, and reporting transparency of preclinical data, with membership spanning academia, industry, and publishing. Requirements pertaining to clinical research are included when guidance documents are also used for preclinical research. The funders, granting agencies, and policy makers surveyed included:
National Institutes of Health (NIH) (Collins and Tabak 2014; LI-COR 2018; Krester et al. 2017; NIH 2015, 2018a, b)
Medical Research Council (MRC) (Medical Research Council 2012a, b, 2016a, b, 2019a, b, c)
The World Health Organization (WHO) (World Health Organization 2006, 2010a, b, 2019)
Wellcome Trust (Wellcome Trust 2015, 2016a, b, 2018a, b, 2019a, b; The Academy of Medical Sciences 2015, 2016a, b; Universities UK 2012)
Canadian Institute of Health Research (CIHR) (Canadian Institutes of Health Research 2017a, b)
Deutsche Forschungsgemeinschaft (DFG)/German Research Foundation (Deutsche Forschungsgemeinschaft 2015, 2017a, b)
European Commission (EC) (European Commission 2018a, b; Orion Open Science 2019)
Institut National de la Santé et de la Recherche Médicale (INSERM) (French Institute of Health and Medical Research 2017; Brizzi and Dupre 2017)
US Department of Defense (DoD) (Department of Defense 2017a, b; National Institutes of Health Center for Information Technology 2019)
Cancer Research UK (CRUK) (Cancer Research UK 2018a, b, c)
National Health and Medical Research Council (NHMRC) (National Health and Medical Research Council 2018a, b, 2019; Boon and Leves 2015)
Center for Open Science (COS) (Open Science Foundation 2019a, b, c, d; Aalbersberg 2017)
Howard Hughes Medical Institute (HHMI) (ASAPbio 2018)
Bill & Melinda Gates Foundation (Gates Open Research 2019a, b, c, d, e)
Innovative Medicines Initiative (IMI) (Innovative Medicines Initiative 2017, 2018; Community Research and Development Information Service 2017; European Commission 2017)
Preclinical Data Forum Network (European College of Neuropsychopharmacology 2019a, b, c, d)
2.2 Publishers/Journal Groups
Journal publishers and groups have been revising author instructions and publication policies and guidelines, with an emphasis on detailed reporting of study design, replicates, statistical analyses, reagent identification, and validation. Such revisions are expected to encourage researchers to publish robust and reproducible data (National Institutes of Health 2017). Those publishers and groups considered in the analysis were:
NIH Publication Guidelines Endorsed by Journal Groups (Open Science Foundation 2019d)
Transparency and Openness Promotion (TOP) Guidelines for Journals (Open Science Foundation 2019d; Nature 2013)
Nature Journal (Nature 2017, 2019; Pattinson 2012)
PLOS ONE Journal (The Science Exchange Network 2019a, b; Fulmer 2012; Baker 2012; Powers 2019; PLOS ONE 2017a, b, 2019a, b, c; Bloom et al. 2014; Denker et al. 2017; Denker 2016)
Journal of Cell Biology (JCB) (Yamada and Hall 2015)
Elsevier (Cousijn and Fennell 2017; Elsevier 2018, 2019a, b, c, d, e, f, g; Scholarly Link eXchange 2019; Australian National Data Service 2018)
2.3 Summary of Overarching Themes
Guidelines implemented by funding bodies and publishers/journals to attain data reproducibility can take on many forms. Many agencies prefer to frame their guidelines as recommendations in order to accommodate scientific freedom, creativity, and innovation. Therefore, typical guidelines that support good research practices differ from principles set forth by good laboratory practices, which are based on a more formal framework and tend to be more prescriptive.
In reviewing current guidelines and initiatives around reproducibility and robustness, several key areas that support robust and reproducible research were identified; these are discussed below.
Research Design and Analysis
Providing a well-defined research framework and statistical plan before initiating the research reduces bias and thus helps to increase the robustness and reproducibility of the study.
Funders have undertaken various initiatives to support robust research design and analysis, including developing guidance on grant applications. These require researchers to address a set of objectives in the grant proposal, including the strengths and weaknesses of the research, details on the experimental design and methods of the study, planned statistical analyses, and sample sizes. In addition, researchers are often required to abide by existing reporting guidelines, such as ARRIVE, and are asked to provide associated metadata.
Some funders, including NIH, DFG, NHMRC, and HHMI, have developed well-defined guidance documents on robustness and reproducibility for applicants, while others, including Wellcome Trust and USDA, have taken additional approaches to implementing such guidelines. For instance, Wellcome Trust held a symposium, and USDA held an internal meeting, to identify approaches and discuss solutions for building strong study designs and developing rigorous study plans.
As another example, researchers on MRC-funded research projects are required to complete a dedicated “Reproducibility and statistical design annex” providing information on methodology and experimental design.
Apart from funders, journals are also working to improve the quality and reporting of study design, for example by requiring that authors complete an editorial checklist before submitting their research in order to enhance the transparency of reporting and thus the reproducibility of published results. Nearly all journals considered here, including Nature, Journal of Cell Biology, and PLOS ONE, as well as the major journal publisher Elsevier, have introduced this requirement.
Some journals are also prototyping alternative review models, such as early publication, to help verify study design. For instance, in Elsevier’s Registered Reports initiative, the experimental methods and proposed analyses are preregistered and reviewed before study data are collected. Articles are accepted on the basis of the study protocol, which prevents authors from modifying their experiments or excluding essential information on null or negative results in order to get published. However, this model has been implemented in only a limited number of journals in the Elsevier portfolio. PLOS ONE permits researchers to submit their articles before peer review is conducted, allowing authors to seek feedback on draft manuscripts before or in parallel with formal review or submission to the journal.
Training and Support
Providing adequate training to researchers on the importance of robust study design and experimental methods can help to capture relevant information crucial to attaining reproducibility.
Funders such as MRC have deployed training programs for both researchers and new panel members on experimental design, statistics, and the importance of robust and reproducible research results.
In addition to a detailed guidance handbook for biomedical research, WHO has produced separate, comprehensive training manuals for both trainers and trainees to learn how to implement their guidelines. Also of note, the Preclinical Data Forum Network, sponsored by the European College of Neuropsychopharmacology (European College of Neuropsychopharmacology 2019e) in Europe and Cohen Veterans Bioscience (Cohen Veterans Bioscience 2019) in the United States, organizes yearly training workshops to enhance awareness and to help junior scientists further develop their experimental skills, with a prime focus on experimental design to generate high-quality, robust, reproducible, and relevant data.
Reagents and Reference Material
Developing standards for laboratory reagents is essential to maintaining reproducibility.
Funders such as HHMI require researchers to make available all tangible research materials integral to a publication, including organisms, cell lines, plasmids, and similar materials, either through a repository or by sending them directly to requestors.
Detailed laboratory protocols are required to reproduce a study; without them, researchers may introduce process variability when attempting to reproduce the protocol in their own laboratories. These protocols can also be used by reviewers and editors during the peer review process, or by researchers to compare methodological details between laboratories pursuing similar approaches.
Funders such as INSERM have introduced an electronic lab book, a platform that digitizes experimental work and enables researchers to better trace and track the data and procedures used in experiments.
Journals such as PLOS ONE allow authors to deposit their laboratory protocols in repositories such as protocols.io. A unique digital object identifier (DOI) is assigned to each protocol and linked to the Methods section of the original article, allowing researchers to access the published work along with the detailed protocols used to obtain the results.
Reporting and Review
Providing open and transparent access to the research findings and study methods and publishing null or negative results associated with a study facilitate data reproducibility.
Funders require authors to report, cite, and store study data in its entirety, and have developed various initiatives to facilitate data sharing. For instance, CIHR and NHMRC have implemented open access policies that require researchers to store their data in specific repositories to improve discovery and facilitate interaction among researchers; to obtain a Creative Commons Attribution (CC BY) license for their research so that other researchers can access and use the data in part or as a whole; and to link their research activities via identifiers such as digital object identifiers (DOIs) and ORCID iDs, allowing appropriate citation of datasets and providing recognition to data generators and sharers.
Wellcome Trust and Bill & Melinda Gates Foundation have launched their own publishing platforms – Wellcome Open Research and Gates Open Research, respectively – to allow researchers to publish and share their results rapidly.
Other data-focused efforts include the European Commission’s “European Open Science Cloud,” an open research platform intended to act as a virtual repository for data from publicly funded studies and to allow European researchers to store, process, and access research data.
In addition, the Preclinical Data Forum Network has been working toward building a data exchange and information repository and incentivizing the publication of negative data by issuing the world’s first prize for published “negative” scientific results.
Journals have also taken various initiatives to allow open access to their publications. Some journals, such as Nature and PLOS ONE, require researchers to submit data availability statements, which help readers locate and access primary and large-scale data through repository details, digital object identifiers, or accession numbers.
Journals also advise authors to upload their raw data and metadata to appropriate repositories. Some journals have created their own cloud-based repositories in addition to those publicly available. For instance, Elsevier has created Mendeley Data to help researchers manage, share, and showcase their research data, and JCB has established the JCB DataViewer, a cross-platform repository for storing large amounts of raw imaging and gel data from its published manuscripts. Elsevier has also partnered with initiatives such as Scholix and FORCE11, which enable data citation, encourage reuse of research data, and support the reproducibility of published research.