Research Article

Taxonomy of interventions at academic institutions to improve research quality

[version 1; peer review: 2 not approved]
PUBLISHED 05 Aug 2024

This article is included in the Research on Research, Policy & Culture gateway.

Abstract

Background

Research waste has become an increasing issue for research institutions and researchers due to poor research reproducibility and replicability. Interventions to improve research quality at research institutions are important to reduce research waste. This review aims to identify and classify possible interventions to improve research quality, reduce waste, and improve reproducibility and replicability within research-performing institutions.

Methods

Steps to develop the taxonomy were: 1) use an exemplar paper looking at journal-level interventions to improve research quality; 2) adapt intervention titles to align with Michie’s behaviour change wheel; 3) conduct a 2-stage search in PubMed using seed articles and reviews, and a forward and backward citation search, to identify articles that evaluated or described the implementation of interventions to improve research quality; 4) pilot the draft taxonomy with researchers at an open science conference workshop; and 5) iterative drafting and revisions by the research team.

Results

Overall, 93 individual interventions were identified through the peer-review literature and researcher reporting. Eleven peer-reviewed articles were identified. Interventions identified covered research stages from before, during, and after study conduct, and whole of institution. Types of intervention included: Tools, Education & Training, Incentives, Modelling and Mentoring, Review & Feedback, Expert involvement, and Policies & Procedures. The taxonomy identified areas for research institutions to focus on to improve research quality, reproducibility, and replicability.

Conclusions

Areas of focus and future research include improving incentives to implement quality research practices, evaluating current interventions, encouraging no- or low-cost and high-benefit interventions, examining institution culture and individual research ethos, and encouraging researcher mentor-mentee relationships.

Keywords

Academies and Institutes, research quality, reproducibility, reproducibility of results, replicability, taxonomy

Introduction

Over the past decade, the problems of research waste and the reproducibility crisis have been extensively documented.1–4 In 2014, a 5-part series in the Lancet found that approximately 85% of biomedical research goes to waste through the combination of poor study design, non-publication, and poor reporting,1 with a similar percentage recently reported for ecology research.5 Studies in disciplines as diverse as economics, cancer biology, psychology, machine learning, ecology, and social sciences have found disappointingly low reproducibility and replicability.1–4

Low reproducibility means that the original protocol, materials, or data sets may not be available to conduct the analysis needed to reproduce the results; low replicability relates to the inability to re-conduct, or to re-conduct well, an entire study or experiment, regardless of whether the results replicate.6 The two issues, reproducibility and replicability, exist on a spectrum from ‘direct’, which strictly follows the original methods, to ‘conceptual’, where researchers may selectively alter aspects of the original methods to test for robustness and generalisability.7,8 Both are important to reducing research waste and improving overall research practice quality.

Poor research reproducibility and replicability is partly attributable to flaws in study design and partly to incomplete or poor documentation of research processes. The flow-on effects impact research users such as industry. Many of these problems are avoidable and might be reduced with sustained interventions by research institutions. Some key stakeholders in improving the research system to improve quality are researchers, journals, and research funders, as well as research institutions. While this paper focuses on research institutions, the findings are relevant to the other stakeholders.

What might research institutions do to improve the quality and reproducibility of their research? This work builds on a taxonomy of interventions for journals and publishers developed by Blanco et al in their scoping review of interventions to improve adherence to reporting guidelines in health research, which classifies interventions by the type of intervention and by the research stage.9 The current taxonomy expands the behaviour change categories used by Blanco, drawing on Michie’s behaviour change wheel, which covers Training, Incentivisation, Modelling, Persuasion, Education, and Coercion. This taxonomy is useful not just for identifying existing interventions, but also for identifying gaps. The taxonomy of interventions will serve several purposes: (i) to assist with search and classification in scoping reviews, (ii) to enable systematic reviews on specific topics – which may be a cell, row, or column of the taxonomy table, and (iii) to help prioritise the development and evaluation of additional interventions in institutions. This review aims to develop a taxonomy of possible interventions to improve research quality, reduce waste, and improve reproducibility and replicability within research-performing institutions. To assist this process, we first aimed to identify and classify published and unpublished examples, including studies that assessed the interventions.

Methods

A preprint of this paper is available on bioRxiv: https://www.biorxiv.org/content/10.1101/2022.12.08.519666v1.10 Research institutions included academic institutions such as universities, government research institutes, and privately funded institutions. Within institutions, interventions could occur at a range of levels, from actions by individuals to actions by departments and whole institutions, including policy changes.

The interventions could be training or education, institutional incentives or regulations, or provision of infrastructure and tools; the only requirement was that the intervention must be aimed at some aspect of reducing research waste, improving quality, or improving reproducibility. For example, interventions aimed at better study design or better conduct of research, increased or timelier publication of research, better reporting of research, including better ‘open science’ such as the provision of protocols and other research process details, and research data would all be in scope.

The search

Because the potential range of interventions and terms used was broad and unknown, we used a 2-stage process for the search. Stage 1 used a set of seed articles and reviews identified by the authors from a preliminary search, which identified several articles including a review of journal interventions. We then used a forward and backward citation search of this set of articles to widen the pool of potential articles. Stage 2 then conducted a word frequency analysis on these eligible articles to identify key terms to build a search strategy for the full database searches.
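A word frequency analysis of this kind can be sketched in a few lines. The sketch below is illustrative only – the function name, stopword list, and sample titles are ours, not from the paper: it counts candidate search terms across the titles or abstracts of the Stage 1 articles, and the most frequent terms could then seed a database search string.

```python
from collections import Counter
import re

# Minimal stopword list for illustration; a real analysis would use a fuller one.
STOPWORDS = {"the", "of", "and", "to", "in", "a", "for", "on", "with", "is", "that"}

def term_frequencies(texts, min_len=4):
    """Count candidate search terms across a set of titles or abstracts."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z]+", text.lower())
        counts.update(w for w in words if len(w) >= min_len and w not in STOPWORDS)
    return counts

# Hypothetical titles standing in for the eligible articles from Stage 1.
titles = [
    "Interventions to improve adherence to reporting guidelines",
    "Improving reproducibility through institutional training interventions",
]
for term, n in term_frequencies(titles).most_common(5):
    print(term, n)
```

The top-ranked terms (here, e.g., ‘interventions’) would be combined with controlled vocabulary to draft the full search strategy.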

This Stage 1 search identified a key review article on interventions to improve adherence to reporting guidelines for journals, which included a suggested taxonomy.9 We then drafted an initial taxonomy and used the other interventions identified from the searches to test and modify this proposed intervention taxonomy.

Participatory design stage

In this next stage, we sought input from others regarding the taxonomy and for further examples of interventions. To ensure the taxonomy was reflective of research practice in institutions, we invited possible end-users to assist in co-development using aspects of participatory design research.11 During the 2021 Association for Interdisciplinary Meta-Research and Open Science Conference (https://www.ivvy.com.au/event/aimos2021), we held a workshop with approximately 40 participants to further refine the draft taxonomy. Detail on the audience involvement in the development of the taxonomy was outlined in the conference schedule, and attendees were informed that their contributions would add to the development of the taxonomy. Workshop participants included ~30 researchers at different career levels, ranging from PhD students to professors.

Briefly, the steps of the workshop process were:

  • 1. List any interventions you have conducted, attended, or heard of.

  • 2. Map these interventions onto the taxonomy, using a Google Doc accessible to all participants (Note: if they do not fit, then put them into a second list)

  • 3. Discussion of interventions that do not fit the proposed taxonomy (do these warrant a change to the taxonomy?)

  • 4. General discussion on next steps (reviewing and applying feedback and opportunity to contribute in future rounds of revision and authorship).

Following the workshop, we used the participant input to develop the revised taxonomy, collect further potential examples, and revise the taxonomy again. In the following year, the taxonomy was presented at the 2022 Association for Interdisciplinary Meta-Research and Open Science Conference (https://aimos.community/aimos2022), where authors asked for any final feedback. No conference attendees responded to this. Ethics approval was not required as the process involved negligible risk, and the de-identified workshop participants were all sent a draft copy and invited to be authors on the manuscript. No workshop participants responded to the invitation.

Results

The taxonomy

Interventions were first classified according to the research stage of their implementation: before study conduct, during study conduct, and after study conduct. Research stages were then further subclassified into education, grant writing, protocol writing, research conduct & analysis, manuscript writing, manuscript submission, or post-publication. Table 1 highlights which type of behaviour change interventions, as classified by Michie’s behaviour change wheel, are represented at each research stage.

Table 1. Outline of classification of interventions and their relationship to research stage – condensed version of taxonomy.

Columns of the table: whole of institution, then research stage – before study conduct (education, grant writing, protocol writing), during study conduct (research conduct & analysis), and after study conduct (manuscript writing, manuscript submission, post-publication). Underlying mechanisms of intervention, including institutional culture and individual ethos, apply throughout. Example interventions, by type of intervention (spanning the research stages above):

• Tools (20) (Enablement): Availability of open source and reproducible software packages; peer-to-peer tool sharing; boilerplate language; provision of study design specific protocol templates; shared version control repositories; author and contributor unique identifiers, e.g., ORCID; journal management system elicitation of registration and other quality indicators.

• Education and Training in research quality and reproducibility (Training): Department or staff within the institution dedicated to research quality and reproducibility interventions and activities; training on systematic literature searches; personalised, tailored support, e.g., for statistics; training on use of reporting guidelines including protocols and registration; training research assistants and others in good data collection practices; training on writing tools, reporting guidelines and software (21-23); training on the submission process, including accessing funds for publication fees; training on oral and poster presentations for conferences and research seminars in different modes: face-to-face, online live, and pre-recorded.

• Incentives to enhance awareness, accessibility & understanding (Incentivisation): Hiring and promotion criteria that include open science practices; awarding small grants/prizes for adhering to best methodological practice; including code/data sharing in promotion criteria.

• Modelling and Mentoring to encourage quality and reproducibility (Modelling): Creating research teams with an effective mix of research expertise; mentor/mentee partnerships; encouraging researchers to apply for grants where the Registered Report is linked to a funder and a journal; use of DevOps practices for research software and analysis development; encouragement of protocol publication; modelling use of social media for dissemination.

• Review & Feedback (Persuasion): Education for ECRs on how to conduct peer review (22); peer review of proposals and protocols (24); peer review of protocols; ‘living research’ analyses in articles shared in a ‘sandbox’ computing environment; pre-submission peer review (24) and code review; post-publication peer review.

• Expert involvement and advice (Education): Specific hiring of people with experience of open research, data stewards, etc., and/or training those currently employed; availability of peers and colleagues to assist one another in research quality improvement; engaging with external consulting organisations; librarian involvement for literature reviews, e.g., search strategies; a dedicated data champion; writing support for manuscripts (6); a publications officer to check adherence of papers to reporting guidelines; dissemination to end-users.

• Policies & Procedures (Coercion): An open science curriculum for under- and postgraduates; seed grants to refine ‘near miss’ grant applications which meet quality criteria; mandated study registration; requirements for data management plans and integrity checks; policies for authorship, reporting checklists, and appropriate journal lists; data sharing policies; random audits of research output.

‘Whole of institution’ was included as an additional category, separate to the research stages, as some interventions relate to two or more stages of research or support overall research practices in that institution. Similarly, ‘Institutional Culture and Individual Ethos’ was added to the taxonomy to highlight the influence of the culture of the institution including their overall research aims and mission, and those that work in the institution and their individual ethos, values, and attitudes towards research practices.

Table 1 gives examples of the interventions identified. The full set of interventions is displayed on the Open Science Framework website (“Taxonomy of interventions at academic institutions to improve research quality - Full Taxonomy Table and Example Interventions”). Overall, we identified 93 different possible interventions.

The types of interventions varied widely, from whole-of-institution policies – such as modifying hiring and promotion criteria to emphasise rigorous research design, reproducibility, and transparency – to highly specific departmental-level interventions such as developing mentor-mentee relationships. Most interventions are applicable to researchers at all levels of experience. Several interventions are specific to particular areas of research, e.g., registration of clinical trials in healthcare, but others, such as mentoring or journal clubs, are relevant to multiple disciplines. We did not subclassify by discipline.

In reviewing the taxonomy, several themes emerged. Many of the interventions require a substantial and long-term investment in people – e.g., hiring of specific experts; training of research assistants and others on data collection methods and techniques; co-design with patients and public/end-users. Though ad hoc seminars have value, most of the interventions require individuals or teams to be embedded in institutions, even if the intervention is to provide ‘just in time’ advice. For example, a publications officer who checks the adherence of papers to reporting guidelines might need to be well established in the institution to be able to provide on-the-spot advice at a time of need.

Education of researchers and research support staff can happen through a variety of formal and informal methods. There was some suggestion that some of the training had to be compulsory, e.g., included in the curriculum for undergraduate and postgraduate research training. However, there was also specific recognition of the role of informal networks, including peer-to-peer learning and mentor-mentee relationships. We note that mentor-mentee learning can flow in both directions, as more junior staff are sometimes the instigators of novel research practices learned during their research skill development. As training in undergraduate and postgraduate programs is constantly changing, more experienced researchers can be exposed to new practices by mentoring a student or Early Career Researcher.

There were surprisingly few technical interventions suggested. Most of these also included an element of human intervention, e.g., use of pull requests and code commentary by collaborators and/or external peers on shared codebases. Notably, one of the technical interventions was to cease subsidising (through the purchase of site licenses) certain software programs (e.g., statistical and spreadsheet) that are not conducive to reproducibility, and to instead promote and encourage the use of open source software and open science practices.
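To make the flavour of such technical interventions concrete, here is a minimal sketch of an automated reproducibility check of the kind that could run during code review on a shared codebase. It is our illustration, not a tool described in the paper, and the two checks are deliberately naive heuristics.

```python
import re

# Naive reproducibility lint: flag script patterns that commonly undermine
# reproducibility (hard-coded machine-specific paths, unseeded random calls).
# Both patterns are crude heuristics for illustration only.
CHECKS = {
    "absolute path": re.compile(r"[\"'](?:[A-Za-z]:\\|/home/|/Users/)"),
    "unseeded RNG call": re.compile(r"\brandom\.(?:random|shuffle|randint)\b"),
}

def lint_script(source):
    """Return (line_number, issue) pairs for a script's source text."""
    issues = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for issue, pattern in CHECKS.items():
            if pattern.search(line):
                issues.append((lineno, issue))
    return issues

script = 'rows = open("/Users/alice/data.csv").readlines()\nrandom.shuffle(rows)'
print(lint_script(script))  # flags line 1 (absolute path) and line 2 (unseeded RNG call)
```

In practice such a check would be one small piece of a code-review workflow (e.g., a pre-commit hook), alongside the human review that participants emphasised.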

Surprisingly, the ‘incentives’ row has more empty cells than the other rows. This indicates either a lack of awareness among the participants involved in reviewing the taxonomy, and/or a lack of incentives being implemented and available at research institutions to encourage researchers to adopt quality research practices.

Following the classification, we searched for papers that had described and/or assessed these interventions. During the search processes, eleven articles evaluating interventions were found. All the interventions that had been assessed were in the manuscript and grant writing or education phase of research (Table 2).

Table 2. All primary literature found on interventions that aim to improve research quality and reproducibility at the institutional level.

Fields: author, year (citation); population; aim; type of intervention; research phase.

• Barnes, 2015 (20). Population: Masters and doctoral students in public health and medical research. Aim: to evaluate the impact of an online writing aid tool on the completeness of reporting of two-arm parallel-group RCTs evaluating pharmacologic and non-pharmacologic interventions. Type: Training. Phase: Manuscript writing.

• Böschen, 2021 (21). Population: APA full-text journals. Aim: to evaluate JATSdecoder as an automated tool to facilitate checking of reported statistical results for consistency and completeness. Type: Tools. Phase: Manuscript writing.

• Burns, 2014 (22). Population: authors and reviewers of the Canadian Critical Care Trials group for grants and manuscripts. Aim: to formally evaluate authors’ and reviewers’ perceptions of internal peer review before journal submission. Type: Review & Feedback. Phase: Manuscript writing.

• Chauvin, 2019 (23). Population: early career researchers (ECRs) (although at the journal level, potentially translatable). Aim: to evaluate the accuracy in identifying inadequate reporting in RCT reports by ECRs using an online CONSORT-based peer-review tool (COBPeer) versus the usual peer-review process. Type: Training; Review & Feedback. Phase: Manuscript writing; Education.

• Gattrell, 2016 (24). Population: authors of RCTs. Aim: to examine the relationship between medical writing support and the quality and timeliness of reporting of randomised controlled trial results. Type: Expert Involvement and Advice. Phase: Manuscript writing.

• Hawwash, 2019 (25). Population: doctoral and postdoctoral researchers. Aim: to assess the intention to use a Writing Aid software, which integrates four research reporting guidelines (Consolidated Standards of Reporting Trials, Preferred Reporting Items for Systematic Reviews and Meta-Analyses, Strengthening the Reporting of Observational Studies in Epidemiology, and Strengthening the Reporting of Observational Studies in Epidemiology-nutritional epidemiology) and their Elaboration & Explanation (E&E) documents during the write-up of research in Microsoft Word, compared with current practices. Type: Training. Phase: Manuscript writing.

• Hirschey, 2019 (26). Population: Advanced Practice Nurses (APNs) undertaking a Doctor of Nursing Practice program. Aim: to enhance APNs’ writing skills with a series of online modules, a workshop, and a manuscript checklist. Type: Training. Phase: Manuscript writing.

• Nuijten, 2020 (27). Population: full text of APA journals. Aim: to describe the statcheck tool and provide an example of its use in a meta-analysis. Type: Tools. Phase: Manuscript writing.

• Shanahan, 2017 (28). Population: authors of speciality medical research journals. Aim: to investigate whether a decision tree tool made available during the submission process facilitates author identification of the relevant reporting guideline. Type: Policy & Procedures. Phase: Manuscript writing.

• Struthers, 2021 (29). Population: authors submitting to BMJ Open. Aim: to provide an outline of the reporting guideline identification tool, GoodReports.org, and to describe user experience and behaviour when using the tool inside and outside of manuscript submission to a journal. Type: Tools. Phase: Manuscript writing.

• Toelch, 2018 (30). Population: university-level research course students. Aim: to evaluate an introductory digital tools course that guides students towards a reproducible science workflow, including research transparency and reproducibility. Type: Policy & Procedures. Phase: Education.

Discussion

To improve institutional interventions that might improve research quality, an understanding of the range and types of interventions is vital. Based on the work of Blanco et al (2019) in their scoping review of interventions to improve adherence to reporting guidelines in health research, we have developed a taxonomy of possible interventions to improve research quality and reproducibility within institutions. At the institutional level, interventions are possible at all stages of research, and for each stage there are several possible interventions. In this section, we relate key lessons from the taxonomy development to the Australian research context, and then touch on the broader global research environment.

Through an iterative crowdsourced process, we identified interventions that the authors or the hackathon participants had experienced, that they had seen conducted in their or others’ institutions, or which they wanted to see. Very few of the interventions have been evaluated. In several areas where interventions are possible none were identified, or the interventions suggested are only aspirational at this point.

A recent Australian Chief Scientist declared a need to ‘shift from quantity to quality’ and to challenge the status quo of ‘a passive apprenticeship system’ of researcher training.12,13 Research quality has become a much-discussed topic in Australia and internationally, but there is no systematic approach to improving research quality, especially regarding what interventions are needed at institutions. In Australia, research quality is most prominently assessed through the Excellence in Research for Australia (ERA) process. However, ERA assesses research outputs, not any part of the research process. A wider national focus on research quality has come through the work of the National Health and Medical Research Council’s (NHMRC) Research Quality Steering Committee (RQSC), established in 2018. In 2019, the RQSC oversaw a survey of Australian research institutions and researchers.14 Key opportunities identified from that survey relevant to interventions at institutions were the need for effective training and mentorship (especially of junior researchers) in responsible research practice; addressing factors that adversely affect research quality, such as poor research practices; promoting positive initiatives and processes rather than competition where possible; and encouraging more rigorous reproducibility procedures.

In Australia’s recently released National Research Infrastructure Roadmap, the importance of research quality is recognised in specific, limited areas, primarily data, e.g., ‘An important driver for maintaining quality research output is Australia’s ability to generate and analyse data as well as improving the digital skills of researchers’.14,15 On a global level, the United Nations Educational, Scientific and Cultural Organization (UNESCO) developed its Recommendation on Open Science in 2021.16 Its recommendations align with the taxonomy in this paper, as they cover interventions related to institutions, including open scientific publications, research data, educational resources, source software and source code, and hardware. Open scientific publications relate to the ‘Education and Training’ – ‘Manuscript Submission’ cell: training on how to identify open access journals, and on accessing funds for publication fees if necessary. Open research data relates to many of the examples provided in the ‘Research Conduct & Analysis’ column of the taxonomy, including shared version control repositories and data dictionaries. Open educational resources include the examples in the ‘Education and Training’ row of the taxonomy, including making training sessions available in hybrid delivery modes. Open source software and code are included in the ‘Tools’ row. Lastly, open hardware relates to ‘training manuals/data collection protocol, including use of equipment’ in the ‘Research Conduct & Analysis’ – ‘Education and Training’ cell.

There have been other classifications of potential interventions, notably the Michie behaviour change wheel and the Nosek pyramid.3,17 Both of those classifications align with ours in that they range from interventions that are simply a change in the environment – our ‘tools’, Nosek’s ‘make it easy’, Michie’s ‘enablement’ – through to required actions – our ‘policies and procedures’, Nosek’s ‘make it required’, Michie’s ‘coercion’. What the classifications all demonstrate is the need for a range of approaches. By mapping interventions to specific research stages and intervention types, we have demonstrated the range of possible interventions, where there are gaps, and especially the relative lack of assessment of these interventions.

Limitations

The interventions in the taxonomy we present are not a comprehensive list of all possible interventions. We did not assess adherence to any of the interventions or examine their effectiveness, except where there were previously published papers. However, a recent review protocol has been published by a group of authors who aim to identify such interventions that investigate research reproducibility and replicability.18 As the participants and authors were largely or exclusively from the sciences, the list of interventions may not include interventions within Humanities and Social Sciences.

A strength of this research is the use of both published peer-reviewed literature and end-user engagement to develop the taxonomy.

Future directions

Given that most interventions outlined in the taxonomy have not been evaluated for their impact on research quality and reproducibility, there is a clear need for more institutional interventions to be evaluated. Priority areas for evaluation should be those currently in common use at institutions, to assess their value. Implementation of new or different interventions could focus on those that are no- or low-cost, such as open access tools and software to enhance research practices, e.g., Overleaf and JASP, and the adaptation of policies and the research environment to promote open science practices.

Institution culture and individual researcher ethos have a strong influence over the reproducibility, quality, and transferability of research practices. The UK Reproducibility Network encourages institutions to examine their research culture and how it may or may not be supportive of producing robust and credible research.19 The implementation and evaluation of interventions outlined in our taxonomy should be considered along with the institution’s current culture and potential shifts that could be made to encourage and promote open science practices.

Finally, it is vital to explore the paucity of ‘incentive’ interventions identified. Incentivisation is important in the workload models of research institutions: much as universities have incentives for their education and teaching of degrees and coursework, they need incentives for research quality. The kind of incentives will depend heavily on institutional structures and the availability of resources to create or fund them. Future research, guided by this taxonomy, could identify how incentivisation of quality research practices could be better implemented.

Ethics and consent

Ethics approval was not applicable for this research. Implied consent was obtained when participants attended the workshop, where the conference workshop description detailed the use of knowledge created for research purposes, and this was verbally confirmed by authors in the workshop.

How to cite this article: R Davidson A, Barbour V, Nakagawa S et al. Taxonomy of interventions at academic institutions to improve research quality [version 1; peer review: 2 not approved]. F1000Research 2024, 13:883 (https://doi.org/10.12688/f1000research.150129.1)

Open Peer Review

Current reviewer status: 2 not approved.

Reviewer Report, 26 Aug 2024. Helen Niemeyer, Freie Universität Berlin, Berlin, Germany. Status: Not Approved.
“Thank you for inviting me to review this manuscript on a very important topic. However, I have a number of concerns and would be glad if the authors could address them: There is no reference for Michie’s behaviour change ...”
How to cite this report: Niemeyer H. Reviewer Report For: Taxonomy of interventions at academic institutions to improve research quality [version 1; peer review: 2 not approved]. F1000Research 2024, 13:883 (https://doi.org/10.5256/f1000research.164669.r311817)

Reviewer Report, 22 Aug 2024. Charlotte R. Pennington, School of Psychology, College of Health and Life Sciences, Aston University, Birmingham, UK. Status: Not Approved.
“Overview and main concerns: This paper aims to provide a taxonomy of possible interventions to improve research quality, reduce waste, and improve reproducibility and replicability within research-performing institutions. The authors state that, through an iterative process of 5 ...”
How to cite this report: Pennington CR. Reviewer Report For: Taxonomy of interventions at academic institutions to improve research quality [version 1; peer review: 2 not approved]. F1000Research 2024, 13:883 (https://doi.org/10.5256/f1000research.164669.r311811)
