Opinion Article

New forms of checks and balances are needed to improve research integrity

[version 1; peer review: 2 approved, 1 not approved]
PUBLISHED 28 May 2014

This article is included in the Research on Research, Policy & Culture gateway.

Abstract

Recent attempts at replicating highly-cited peer-reviewed studies demonstrate that the “reproducibility crisis” is indeed upon us. However, punitive measures against individuals committing research misconduct are neither sufficient nor useful, because this is a systemic issue stemming from a lack of positive incentives. As an alternative approach, here we propose a system of checks and balances for the publishing process that involves 1) technical review of methodology by publishers, and 2) incentivizing direct replication of key experimental results. Together, these actions will help restore the self-correcting nature of scientific discovery.

Introduction

The scientific method provides a systematic framework for formulating, testing and refining hypotheses. By definition, it requires findings to be reliable so that theories can be refined and scientific progress can occur. Recently, it has become clear that the scientific method as currently practiced is failing to self-correct, with multiple studies indicating that more than 70% of surveyed peer-reviewed articles cannot be independently verified [1–4]. Unfortunately, instead of focusing on new systems to promote high-quality reproducible research, most resources and attention are devoted to policing the scientific community by investigating allegations of research misconduct. This approach is destined to fail, because the problem is systemic and not caused by a few bad players who can be caught and punished. Between 1994 and 2003, 259 cases of misconduct were formally investigated by the Office of Research Integrity [5]; in contrast, ~480,000 NIH-funded papers were published over the same period [6]. It would be impractical and ineffective to investigate case by case why 70% of published findings are irreproducible, even though the ability to repeat and build upon prior work is ultimately the key component of research integrity that we should care about. Instead, truly addressing the “reproducibility crisis” requires establishing new checks and balances for the publishing process through 1) technical review of methodology by publishers, and 2) incentivizing direct replication of key experimental results. If we, the scientific community, fail to ensure the quality of the research we produce, other parties with their own vested interests will step in to police us instead [7].

1. Checks: Publishers need to verify quality of research through third-party technical review

Publishers are uniquely placed to significantly improve reproducibility because of their inherent need to garner respect from the scientific community. Nature and EMBO are two stand-out examples that are leading the way in ensuring the quality of the research published in their journals. However, current efforts to ensure quality using peer review alone to weed out irreproducible research are not effective. One reason is that the breadth of technical knowledge now required to review a single study is beyond any individual scientist. The number of authors per article has increased over the last decade [8]; in contrast, peer review still relies on two or three peers who are unlikely to be qualified to assess every experimental technique in a study. Nature has implemented an impressive new policy to reduce irreproducibility of its published papers [9], a key aspect of which is employing expert statisticians to review the statistical analysis of submissions. Currently, a major limiting factor for implementing technical review is the lack of standardization for methodology design and required controls. Establishing and implementing these standards to ensure the technical quality of the research published in their journals is an effective value-added service that publishers should provide as a separate power in the scientific community. The Resource Identification Initiative (https://www.force11.org/node/4463, date accessed: 2014-04-24) is a practical example of reporting materials and methods in a standardized, machine-readable manner. Similar to successful mandates on open access to raw data, journals wield the power to require clear methodology as a prerequisite for publication.
Further, as with open data, the nascent implementation of standardized methodologies will likely provoke debate, but lively discussion within the scientific community is useful for policy refinement (http://blogs.plos.org/everyone/2014/03/08/plos-new-data-policy-public-access-data/, date accessed: 2014-04-25).
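To make the idea of standardized, machine-readable methods reporting concrete, the sketch below shows one way a submission system could represent and validate a reagent entry. This is an illustration only, not a format defined by the Resource Identification Initiative: the field names, vendor, catalog number, and RRID value are hypothetical, though real Research Resource Identifiers do follow the general RRID:<registry prefix>_<accession> shape checked here.

```python
import json
import re

# A hypothetical machine-readable "materials" entry of the kind the
# Resource Identification Initiative encourages: each reagent carries a
# persistent Research Resource Identifier (RRID) that reviewers and
# replicators can resolve unambiguously.
reagent = {
    "name": "anti-beta-actin antibody",  # illustrative reagent
    "vendor": "ExampleBio",              # hypothetical vendor
    "catalog_number": "EB-0001",         # hypothetical catalog number
    "rrid": "RRID:AB_000001",            # illustrative RRID
    "dilution": "1:5000",
}

# RRIDs follow the general pattern "RRID:<registry prefix>_<accession>",
# so a journal's submission system could validate them automatically.
RRID_PATTERN = re.compile(r"^RRID:[A-Za-z]+_\w+$")

def rrid_is_well_formed(record: dict) -> bool:
    """Return True if the record's RRID matches the expected shape."""
    return bool(RRID_PATTERN.match(record.get("rrid", "")))

print(json.dumps(reagent, indent=2))
print(rrid_is_well_formed(reagent))  # True for the entry above
```

Because entries like this are structured rather than free text, a journal could reject a submission whose materials list omits identifiers, in the same way open-data mandates reject submissions without deposited data.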

2. Balances: Direct replication needs to be incentivized for science to be self-correcting

While journals should carry technical review responsibilities, establishing positive incentive structures for reproducible science is necessary to balance the pressure of producing high-profile publications at all costs. Of course, there will always be edge cases where it is not practical to directly replicate findings (for example, unpredictable or one-off events like an earthquake), but the majority of findings should be possible to replicate directly: that is, to repeat the experiment as-is while collecting additional information such as “the reliability of the original results across samples, settings, measures, occasions, or instrumentation” [10]. This is separate from conceptual replication, which is “an attempt to validate the interpretation of the original observation by manipulating or measuring the same conceptual variables using different techniques” [10]. It is also separate from re-analysis of existing raw data to check for errors in analysis and presentation, in which no new data are obtained. Directly reproducing experiments is therefore not merely redundant effort, because new data are generated and analyzed to demonstrate the robustness of the original results.

Journals such as F1000Research and PLOS ONE (http://f1000research.com/author-guidelines, http://www.plosone.org/static/publication, date accessed: 2014-03-14) now consider direct replications of original studies, but a venue to publish is not sufficient on its own: there must also be an effective system to incentivize scientists to conduct replication studies in the first place. The simplest way to conduct replication studies is via fee-for-service technical providers, because of their pre-existing methodological expertise and their neutral academic position (they are motivated by an operational or monetary incentive, and thus neither fear retribution from their peers nor need to accumulate high-impact ‘novel’ publications). Similarly, grants specifically designated for research integrity are vital for driving replication (http://www.arnoldfoundation.org/reproducibility-initiative-receives-13m-grant-validate-50-landmark-cancer-studies, date accessed: 2014-04-28). These are the strategies used by the Reproducibility Initiative (https://www.scienceexchange.com/reproducibility, date accessed: 2014-03-14), and it remains to be proven whether this will be a cost-effective mechanism for conducting direct replications.

The recent ascent of crowd-sourced post-publication peer review has identified manuscripts with problematic content, but it remains most active for articles on new techniques that other researchers are eager to replicate for their own experiments (e.g. http://www.ipscell.com/stap-new-data/, date accessed: 2014-04-28, and http://f1000research.com/articles/3-102/v1, date accessed: 2014-05-20). Positively incentivizing direct replication is therefore necessary for science to become self-correcting again: no one would selectively publish only the experiments that worked, or manipulate their findings, knowing that a replication attempt, whether experimental or analytical, would fail to find the same significant outcome. Scientists would also be more willing to share their raw data and full methodologies before publishing, in order to make sure that their findings are reproducible. Failing to identify robust and reproducible research is very costly and impairs our ability to make effective progress against diseases like cancer, in which we have already invested billions of dollars. Establishing new checks and balances with existing members of the scientific community, such as publishers and fellow scientists, is far preferable to checks imposed by outside authorities. And if science progresses by “standing on the shoulders of giants”, it is our duty as scientists to ensure that those shoulders are steadfast for our peers.

Comments on this article (0)

How to cite this article
Iorns E and Chong C. New forms of checks and balances are needed to improve research integrity [version 1; peer review: 2 approved, 1 not approved]. F1000Research 2014, 3:119 (https://doi.org/10.12688/f1000research.3714.1)
NOTE: If applicable, it is important to ensure the information in square brackets after the title is included in all citations of this article.

Open Peer Review

Key to Reviewer Statuses
Approved - The paper is scientifically sound in its current form and only minor, if any, improvements are suggested.
Approved with reservations - A number of small changes, sometimes more significant revisions, are required to address specific details and improve the paper's academic merit.
Not approved - Fundamental flaws in the paper seriously undermine the findings and conclusions.
Version 1
Reviewer Report 18 Jun 2014
David Soll, Department of Biology, University of Iowa, Iowa City, IA, USA 
Not Approved
Iorns and Chong state in the first paragraph of their Opinion Article that “70% of surveyed peer-reviewed articles cannot be independently verified”. Iorns, who heads the company Science Exchange, Inc., reported the same statistic in an interview with Jennifer Welsh …
HOW TO CITE THIS REPORT
Soll D. Reviewer Report For: New forms of checks and balances are needed to improve research integrity [version 1; peer review: 2 approved, 1 not approved]. F1000Research 2014, 3:119 (https://doi.org/10.5256/f1000research.3980.r4918)
Reviewer Report 16 Jun 2014
Ivan Oransky, Department of Science, Health and Environmental Reporting, New York University, New York, NY, USA; Retraction Watch, New York, NY, USA
Approved
Thank you for the opportunity to review this article. It makes an important argument in a critical area of inquiry, and deserves publication.

I have some specific suggestions for improvement below:
  1. "...punitive measures against individuals committing research misconduct are neither sufficient nor …
HOW TO CITE THIS REPORT
Oransky I. Reviewer Report For: New forms of checks and balances are needed to improve research integrity [version 1; peer review: 2 approved, 1 not approved]. F1000Research 2014, 3:119 (https://doi.org/10.5256/f1000research.3980.r4917)
Reviewer Report 10 Jun 2014
Andrew D. Chalmers, Department of Biology and Biochemistry, University of Bath, Bath, UK 
Approved
Improving reproducibility is a key challenge and topical area in the life sciences. The submitted manuscript provides a well-written and interesting commentary on the topic and suggests two key approaches to improve reproducibility, based on technical review and incentivizing …
HOW TO CITE THIS REPORT
Chalmers AD. Reviewer Report For: New forms of checks and balances are needed to improve research integrity [version 1; peer review: 2 approved, 1 not approved]. F1000Research 2014, 3:119 (https://doi.org/10.5256/f1000research.3980.r4922)
