Introduction
An essential part of the scientific method is that researchers can repeat the experiments of others and test the outcomes themselves. To achieve this requires accurate reporting not just of the results of those experiments but also of the methods that underpin them. However, as science becomes more technology-driven, the equipment used is more specialised, the data generated is harder to represent in traditional media, and reporting how experiments were performed so that independent researchers can repeat them gets progressively harder. Reproducibility in science is a hot topic and a concerning one; indeed, several commentators have concluded that fallibilities in the way that research investigations are currently conducted, and how their results are disseminated via article publication have become detrimental to the scientific process1–4. The difficulties in ensuring reproducibility are multi-faceted: the problems are systemic. Policy makers, funding agencies, academic institutions, scientific publishers, scientists themselves and the vehicles through which they publish each contribute to a complicated web of issues that conspire against the publication of reproducible results5. Various measures have been proposed to try to combat these problems, ranging from top-down strategies through government initiatives6, to bottom-up strategies such as providing checks and balances for research integrity during the publishing process7. Measures like this tend to come with their own problems and, in some cases, can provide further barriers to reproducibility8.
One way in which reproducibility issues can be tackled is through the implementation of open science and open data practices9,10. As attendees of the AllBio: Open Science & Reproducibility Best Practice Workshop, we discussed how principles of open science could be instilled into the current research workflow; as part of this debate, we tried to identify ways in which reproducibility might be improved.
One route into this workflow is through the peer review process. Peer review is an important gatekeeper and a key part of scientific discourse. Before any research findings can be formally accepted, they must be evaluated and commented upon by peers (experts in their fields), who then provide advice about the quality or validity of the work to Editors, or in the case of open peer review and post-publication invited peer review systems, to the readers themselves. Importantly, peer review happens at a personal rather than institutional level and is carried out by individuals; it is therefore an ideal mechanism for getting a message across to the majority of researchers given everyone peer reviews or is peer reviewed. Of course, the peer-review process is not infallible11,12. The issues are many and varied, including the time available to perform thorough reviews, reviewers’ expertise, journals’ perception of relevance/interest/impact, and so on. Arguably, one of the most significant problems – certainly the one that generates most friction – is that reviewers can safely dispense self-serving and biased critiques, fully protected by the mask of anonymity.
Scientists have become sufficiently frustrated by these issues to devise ad hoc solutions to help safeguard the quality of reviews and allow reviewers to affirm that they will review in an ethical and professional way, and encourage clearer review processes. This has led to the articulation of various forms of reviewer’s oath (e.g. 13–15). It is these that inspired us. Building on this work, we have formulated an oath that codifies the role of reviewers in helping to ensure that the science they review is sufficiently open and reproducible; it includes guidelines not just on how to review professionally, but also on how to support transparent, reproducible and responsible research, while optimising its societal impact and maximising its visibility. We suggest a mode of constructive dialogue between respectful individuals.
The new oath is accompanied by a manifesto that develops the principles set out in the guidelines, and provides further direction for upholding responsible and interactive reviews, as well as the necessary information for other researchers to reproduce the results. A key tenet is that the oath is not meant to be burdensome or to cause friction between reviewers and authors; in fact, their cooperation could improve the accuracy of reviews16. The goal is to provide a supportive framework for guiding reviewers toward professional and ethical behaviours, and to provide the necessary checks on whether they would be able to reproduce the work. If the issue of reproducibility can be satisfied at the point of peer review, then published results should be more reliable, and the scientific community can have greater faith that what they read is solid enough to build on.
The Open Science Reviewer’s Oath
The oath is a simple checklist to use when reviewing, or when considering a review request. We recommend that reviewers add a link to this oath (Box 1) at the top of each review as they begin, to provide an aide-memoire to open review practice, and to inform the authors and potential publishers of the work of their intentions. We hope that making this intent explicit will make the review seem less like a cloak-and-dagger process, will make constructive criticism easier for the author to receive and for the reviewer to provide, and will help to spread the practice of open reviewing.
Box 1. While reviewing this manuscript:
i) I will sign my review in order to be able to have an open dialogue with you
ii) I will be honest at all times
iii) I will state my limits
iv) I will turn down reviews I am not qualified to provide
v) I will not unduly delay the review process
vi) I will not scoop research that I had not planned to do before reading the manuscript
vii) I will be constructive in my criticism
viii) I will treat reviews as scientific discourses
ix) I will encourage discussion, and respond to your and/or editors’ questions
x) I will try to assist in every way I ethically can to provide criticism and praise that is valid, relevant and cognisant of community norms
xi) I will encourage the application of any other open science best practices relevant to my field that would support transparency, reproducibility, re-use and integrity of your research
xii) If your results contradict earlier findings, I will allow them to stand, provided the methodology is sound and you have discussed them in context
xiii) I will check that the data, software code and digital object identifiers are correct, and the models presented are archived, referenced, and accessible
xiv) I will comment on how well you have achieved transparency, in terms of materials and methodology, data and code access, versioning, algorithms, software parameters and standards, such that your experiments can be repeated independently
xv) I will encourage deposition with long-term unrestricted access to the data that underpin the published concept, towards transparency and re-use
xvi) I will encourage central long-term unrestricted access to any software code and support documentation that underpin the published concept, both for reproducibility of results and software availability
xvii) I will remind myself to adhere to this oath by providing a clear statement and link to it in each review I write, hence helping to perpetuate good practice to the authors whose work I review.
The manifesto
Each point of the reviewer’s oath relates to open principles that we consider important; the collection of these principles is the manifesto. The manifesto relates to the oath as follows:
Principle 1: I will sign my name to my review – I will write under my own name
I recognise that reviewing is a role that gives me an advantage over you, and that anonymity would allow me to abuse your trust; I will not do so.
Principle 2: I will review with integrity
ii) I will be open and honest at all times
iii) I will state my limits
iv) I will turn down reviews I am not qualified to provide
v) I will not unduly delay the review process
vi) I will not scoop research that I had not planned to do before reading the manuscript
I recognise that integrity is a social act that requires the majority to hold shared convictions; I will use the majority of ‘doves’ to balance the ‘hawks’ in my review by sharing the content.
I will always state the boundaries of my scientific knowledge and practice; I openly acknowledge that I am not an expert in, and cannot satisfactorily assess every aspect of, my field. I will inform you and the journal when this situation arises.
I will not always be an appropriate reviewer. I will provide journal editors with a fair assessment of my ability and, when necessary, decline to review, and will always expand on the reasons.
I will not write a negative review with the intention of blocking or delaying publication. Where I have already come to the same (or different) conclusions as the author, I will state this fact and suggest the possibility of cooperative publication, either back-to-back or as a merged paper.
I understand that there are conflicts in my field. Sometimes, there may be good reasons for remaining anonymous, which may relate to the integrity of others. Wherever possible, I will highlight abuses of integrity and turn down invitations if I feel I have such a direct conflict that would inappropriately affect my review.
Principle 3: I will treat the review as a discourse with you; in particular, I will provide constructive criticism
vii) I will be constructive in my criticism
viii) I will treat reviews as scientific discourses
ix) I will encourage discussion, and respond to your and/or editors’ questions
I will happily engage in conversation with you about your work, providing constructive criticism where appropriate.
Principle 4: I will be an ambassador for good science practice
x) I will try to assist in every way I ethically can to get your manuscript published, by providing criticism and praise that is valid, relevant and cognisant of community norms
xi) I will encourage the application of any other open science best practices relevant to my field that would support transparency, reproducibility, re-use and integrity of your research
xii) If your results contradict earlier findings, I will allow them to stand, provided the methodology is sound and that you have discussed them in context
xiii) I will check that the data, software code and digital object identifiers are correct, and the models presented are archived, referenced, and accessible
xiv) I will comment on how well you have achieved transparency, in terms of materials and methodology, data and code access, versioning, algorithms, software parameters and standards, so that your experiments can be repeated independently
xv) I will encourage deposition with long-term unrestricted access to the data that underpin the published concept, towards transparency and re-use;
xvi) I will encourage central long-term unrestricted access to any software code and support documentation that underpin the published concept, both for reproducibility of results and software availability
I will uphold and advocate open science practice by pointing out where I believe that the authors can do better with respect to deposition of data, citation of accessions and code etc. Often this will mean circumventing current norms.
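As a concrete illustration of the kind of check that point xiii invites, a reviewer can verify programmatically that the digital object identifiers (DOIs) cited for data and code are at least well-formed and map to a resolver URL. The sketch below is a minimal, hypothetical Python example; the DOI syntax pattern and the sample identifiers are assumptions for illustration, not part of the oath.

```python
# Minimal sketch of an automated reviewer check for oath point xiii:
# are the DOIs cited for data/code syntactically valid, and what is
# their resolver URL? The pattern and sample DOIs are illustrative
# assumptions, not drawn from the oath itself.
import re

# Common DOI shape: "10.", a 4-9 digit registrant code, "/", a suffix.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def resolver_url(doi: str) -> str:
    """Return the doi.org resolver URL for a syntactically valid DOI,
    or raise ValueError so the reviewer can flag it to the authors."""
    if not DOI_PATTERN.match(doi):
        raise ValueError(f"malformed DOI: {doi!r}")
    return f"https://doi.org/{doi}"

# Example: screen the DOIs listed in a manuscript's data-availability section.
manuscript_dois = ["10.1371/journal.pmed.0020124", "not-a-doi"]
for doi in manuscript_dois:
    try:
        print(doi, "->", resolver_url(doi))
    except ValueError as err:
        print("flag for authors:", err)
```

A HEAD request against each resolver URL (for example, via `urllib.request`) would extend this from a syntax check to an accessibility check, at the cost of requiring network access during the review.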
Principle 5: Support other reviewers
As part of my role as a scientist and an open reviewer, I will help other reviewers when they need guidance or support. I understand that new reviewers may not feel entirely secure in managing the conflicts that often arise from the normal academic process. In these cases I will judge a review on its merit and not the individual who has written it.
Author contributions
Dan Maclean, Ivo Grigorov, Michael Markie, Teresa Attwood, Konrad Förstner, Jean-Karim Heriche and Neil Chue Hong conceived and designed the oath and prepared the first draft of the manuscript. All the other authors in the working group were involved in the revision of the draft manuscript and have agreed to the final content.
Competing interests
MM is currently employed by F1000Research. His role at the journal does not include any involvement in the pre-publication editorial checks, or with the refereeing process.
Grant information
ALLBIO - Broadening the Bioinformatics Infrastructure to unicellular, animal, and plant science, Project reference: 289452, Funded under: FP7-KBBE. We would also like to thank The Genome Analysis Centre (TGAC, Norwich, UK) and the Biotechnology and Biological Sciences Research Council (BBSRC, UK). IG was funded by FP7 FOSTER (Grant 612 425).
Acknowledgements
We would like to thank The Genome Analysis Centre (TGAC, Norwich, UK) for organising and hosting the workshop.
We would also like to thank Peter Murray Rust for comments on the preprint (https://zenodo.org/record/12273) and contributing an additional principle to the oath.
References
1. Ioannidis JP: Why most published research findings are false. PLoS Med. 2005; 2(8): e124.
2. Ioannidis JP, Allison DB, Ball CA, et al.: Repeatability of published microarray gene expression analyses. Nat Genet. 2009; 41(2): 149–55.
3. Prinz F, Schlange T, Asadullah K: Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011; 10(9): 712.
4. Hines WC, Su Y, Kuhn I, et al.: Sorting out the FACS: a devil in the details. Cell Rep. 2014; 6(5): 779–81.
5. Collins FS, Tabak LA: Policy: NIH plans to enhance reproducibility. Nature. 2014; 505(7485): 612–3.
6. European Commission: Responsible Research & Innovation Policy. 2012.
7. Iorns E, Chong C: New forms of checks and balances are needed to improve research integrity [v1; ref status: indexed, http://f1000r.es/32k]. F1000Res. 2014; 3: 119.
8. Stodden V: Changes in the Research Process Must Come From the Scientific Community, not Federal Regulation. 2013.
9. Molloy JC: The Open Knowledge Foundation: open data means better science. PLoS Biol. 2011; 9(12): e1001195.
10. Pereira S, Gibbs RA, McGuire AL: Open access data sharing in genomic research. Genes (Basel). 2014; 5(3): 739–747.
11. Patel J: Why training and specialization is needed for peer review: a case study of peer review for randomized controlled trials. BMC Med. 2014; 12(1): 128.
12. Glen AS: A New “Golden Rule” for Peer Review? Bull Ecol Soc Am. 2014; 95(4): 431–434.
13. Watson M: The reviewers oath. 2013.
14. Alexander S: The Peer Reviewer’s Oath. 2014.
15. Verger A: My Reviewer Oath. 2014.
16. Leek JT, Taub MA, Pineda FJ: Cooperation between referees and authors increases peer review accuracy. PLoS One. 2011; 6(11): e26895.
Passive review:
Follows all of Principles 1 through 3, and elements x - xii of Principle 4. Here, an open review is essentially a signed review, ensuring that the process has appropriate integrity.
Active review:
Includes all elements of Passive review, as well as items xiii and xiv of Principle 4:
- xiii) I will check that the data, software code and digital object identifiers are correct, and the models presented are archived, referenced, and accessible
- xiv) I will comment on how well you have achieved transparency, in terms of materials and methodology, data and code access, versioning, algorithms, software parameters and standards, so that your experiments can be repeated independently
The idea of an open review here seems to be that, in addition to ensuring integrity in the review process, there is also extra work being done beyond a standard review, and the person who does such work should be credited for doing so.

I am uncertain where elements xv and xvi, as written:
- xv) I will encourage deposition with long-term unrestricted access to the data that underpin the published concept, towards transparency and re-use;
- xvi) I will encourage central long-term unrestricted access to any software code and support documentation that underpin the published concept, both for reproducibility of results and software availability
would fit. These are neither active nor passive and, as written, they don't match the review function: they are even more active, more a collaboration than a review. I suggest that they be rephrased as:
- xv) I will check that the data that underpin the published concept are made available in a manner that provides long-term unrestricted access, towards transparency and re-use;
- xvi) I will check that any software code and support documentation that underpin the published concept are made available in a manner that provides long-term unrestricted access, both for reproducibility of results and software availability
so that they could be part of an Active review. Of course, this specific remedy is just a suggestion, but the overall point I want to make is that the added work to be done by the reviewer, beyond what is now standard, needs to be explicitly considered both in the oath itself and in the description of the oath.