Announcement: Towards greater reproducibility for life-sciences research in Nature
Since 2013, Nature and the Nature research journals have asked authors of papers in the life sciences to complete a checklist when they submit a paper. This extra step — prompting authors to disclose important elements of experimental design and analysis — was part of a broader effort to improve the quality of reporting in our life-sciences articles.
This week we go further. Alongside every life-sciences paper, we will publish a new reporting-summary document, in which authors will now be expected to record details of experimental design, reagents and analysis. This is another step towards encouraging transparency, ensuring that papers contain sufficient methodological detail, and improving the review and reporting of statistics.
We expect that the new reporting summary will assist reviewers and editors in assessing experimental quality and help readers to locate crucial details on data collection and analysis. Those familiar with the original checklist will find similar elements in the new reporting summary. The summary also has a strong focus on points that are known to be sources of experimental variability and that tend to be poorly reported in the literature.
Nature has long been interested in promoting the reproducibility of published results. Although the issues that give rise to the ‘reproducibility crisis’ are multifaceted, and come into play long before a paper is submitted, our responsibility is to ensure that the research we publish is well planned, well executed and well reported.
It is not possible to capture the diversity of work across the life sciences in a single document. So this new general reporting summary will be accompanied by more-specific and more-detailed accessory reporting summaries, which provide greater experimental detail for papers based on chromatin immunoprecipitation sequencing, flow cytometry or magnetic resonance imaging. Although our physical-sciences papers will not use a standard reporting summary, we are launching accessory summaries on lasers and solar cells to raise reporting standards in these areas. In future, we will expand this set to cover other techniques. Like the core reporting summary, these accessory summaries will be published with the relevant paper.
We are happy for other journals and institutions to use the same approach, and so we have made all the reporting-summary templates available for use or adaptation under a CC-BY licence.
As with the initial checklist, these documents aim to improve reporting, rather than to enforce a defined set of standards. They should make apparent the details of how a study was designed, performed and analysed, to allow reviewers and readers to interpret the results and understand any limitations. There are, of course, separate experimental standards that must be met to comply with our editorial policies, and these are captured in our new editorial-policy checklist.
As a complement to these new documents, we will now mandate greater transparency in data presentation. We will ask authors, where possible, not to use bar graphs, and instead to use approaches that present the full data distribution. We have also expanded our data-deposition mandates to include proteomics data, and our policy on reporting cell-line authentication is being extended to all papers with data from cell lines. And we have added a reporting table for cryo-electron microscopy, which joins those for nuclear magnetic resonance and X-ray crystallography.
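As a rough illustration of the kind of presentation we have in mind (a minimal sketch, not part of our editorial tooling), the following Python/matplotlib example uses hypothetical measurements to overlay individual data points on a box plot, so that the full distribution remains visible rather than being collapsed into a single bar.

```python
# Sketch: show the full data distribution instead of a bar graph.
# The two groups below are hypothetical example measurements.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
control = rng.normal(loc=5.0, scale=1.0, size=20)   # hypothetical control values
treated = rng.normal(loc=6.5, scale=1.5, size=20)   # hypothetical treated values

groups = [control, treated]
positions = [1, 2]

fig, ax = plt.subplots()

# Box plot summarises each distribution (median, quartiles, range)...
ax.boxplot(groups, positions=positions, widths=0.5)

# ...and overlaying jittered points shows every individual observation.
for pos, values in zip(positions, groups):
    jitter = rng.uniform(-0.1, 0.1, size=values.size)
    ax.scatter(np.full(values.size, pos) + jitter, values, alpha=0.7)

ax.set_xticks(positions)
ax.set_xticklabels(["Control", "Treated"])
ax.set_ylabel("Measured value (arbitrary units)")
plt.show()
```

Violin plots or simple strip plots serve the same purpose; the essential point is that every observation, not just a summary statistic, is visible to the reader.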
With these and other steps, we will continue to work to ensure the rigour of the work we publish, and to promote the ability of the community to build on this research. But journals can only do so much. Institutions must invest greater resources in training scientists in scientific rigour and statistics. Funders must enforce appropriate experimental design from the earliest stages of scientific projects. And the community must appreciate the importance of transparency and replication. Only by working together can we all improve research reproducibility.
Nature 546, 8 (01 June 2017) doi:10.1038/546008a