Research Article

A cross-sectional audit and survey of Open Science and Data Sharing practices at The Montreal Neurological Institute-Hospital

[version 1; peer review: 1 approved with reservations, 1 not approved]
PUBLISHED 18 Oct 2023

This article is included in the Research on Research, Policy & Culture gateway.

This article is included in the Data: Use and Reuse collection.

Abstract

Background: Open science is a movement and set of practices to conduct research more transparently. The adoption of open science has been recognized to support innovation, equity, and transparency. The Montreal Neurological Institute-Hospital (Neuro) has committed to becoming an ‘open science’ institute, the first of its kind in Canada. Here we report on an audit of open data practices in Neuro publications and on a survey of Neuro-based researchers’ barriers and facilitators to data sharing.
Methods: In the first study, we collated all Neuro publications from 2019, retrieving 313 unique publications, and extracted information from each article pertaining to data sharing and other open science practices. We included all empirical papers and preprints that were reported in English. In the second study, we surveyed all Neuro researchers; 124 of 553 invited participants completed the survey, a response rate of 22.42%. For the audit, we examined data sharing and open science practices. For the survey, we asked participants about their data sharing practices and perceptions.
Results: We found that 66.5% of publications (n=208) included a data sharing statement. Of the articles with a data sharing statement, 74.5% (n=155) had data that was publicly available. When examining broader open science practices, rates of compliance tended to be lower; for example, 94.9% (n=297) of publications did not link to an accessible protocol. Among participants who had published a first or last authored paper in the past year, most, 53 of 74 (71.62%), reported that they had openly shared their research data. Less than half of the participants, 37.50% (n=45), reported having engaged in training related to data sharing within the last 12 months.
Conclusion: We found that about half of all publications included in the audit shared data. Participants indicated an appetite for resources for learning about data sharing, signaling a willingness to perform better.

Keywords

Open Science, Data Sharing, Audit, Survey, Facilitators and barriers

Introduction

Open science refers to the practices of making research outputs openly available for others to use and build upon. Open science can be seen as an alternative to the typically ‘closed’ model of science, which tends to stress concern over intellectual property and controlling access to scientific knowledge and data.1 Benefits of open science include increased transparency, the ability to replicate research, equity in access to research information, reputational gains, and increased chances of publication.2 Recognizing the value of open science, many stakeholders have begun to implement policy mandates to see that open science practices are being implemented in the community. This includes the creation of federal roadmaps and policies, organization policies, and funder policies, such as Canada’s Roadmap to Open Science,3 the second French national plan for open science,4 the United Nations Educational, Scientific and Cultural Organization’s open science policy,5 the European Union’s open science policies,6 and others.

Beyond these stakeholders, academic institutions have an important role in implementing open science.1 Despite this seemingly obvious role, academic institutions have tended not to feature in discussions around implementing open science. Indeed, academic institutions have previously been criticized for not playing more of a role in addressing issues of research reproducibility,7 which is conceptually related to open science. The reality is that few academic institutions have strong, transparent processes in place to encourage open science. Despite this, research institutions are uniquely positioned to help define research incentives and to value open science practices. Shifting researchers’ behavior towards openness will require education, training, and a culture shift. Research institutions could provide the environment for this training, ensuring a common understanding of open science and helping to shift culture. Inaction by research institutions in playing their role in implementing open science will have downstream consequences for how research is disseminated.

In Canada, a notable exception is the Montreal Neurological Institute-Hospital (henceforth called the ‘Neuro’) which has committed to becoming an open science institute. Having made the scientific and cultural decision to become ‘open’ after a significant consultation and buy-in process,8 the Neuro is now positioned to implement open research practices. The Neuro has made a structural decision to run a focused implementation program wherein a small set of open science practices are initially focused on, and additional practices then added in an incremental manner.

At present there is a focus on data sharing, namely making the data underlying the results reported in a given publication publicly available for others to use and build upon, or to verify the work reported. The rationale for choosing data sharing from among the potential open research practices was: 1. Few researchers have the skills required to share their data and most researchers are not trained in data sharing9; 2. Data sharing has the potential to lead to novel discovery and enhances the transparency of disseminated research findings; 3. The largest government health funder in Canada, the Canadian Institutes of Health Research, recently announced new policies on data management.10 By adopting data sharing now, the Neuro stands to be ahead of incoming mandates and may offer insight for others to follow. Finally, while some data sharing infrastructure exists, a significant investment is being made at the federal level to build capacity for Canadian researchers.11,12

This article describes the results of two studies examining data sharing practices at the Neuro. In the first study, we audited all publications produced by Neuro researchers in 2019. In the second study, we surveyed Neuro-based researchers about barriers and facilitators to data sharing. The results will provide the Neuro with a better understanding of barriers and facilitators to data management and sharing and will identify educational needs related to data sharing that can be implemented to address barriers. Further, the audit of publications can be used to benchmark for improvements over time and to monitor change.

Methods

Ethics approval statement

The study received ethical approval from the Ottawa Health Science Research Network Research Ethics Board (OHSN-REB #20210514-01H).

Transparency statement

We used the STROBE guideline to inform our reporting of the audit13 and the CHERRIES guideline to inform reporting of the survey.14

Study 1: A cross-sectional audit of Neuro publications

Search strategy

We identified the Neuro publications produced in 2019 by searching the Web of Science (WoS)15 and using its metadata to capture the Neuro’s output, as well as searching two preprint servers, medRxiv16 and bioRxiv.17 The search strategy was developed by trained information specialists and librarians (Sanam Ebrahimzadeh and Alex Amar-Zifkin). For the full search strategy please see Extended Data.

Patient and public involvement

Patients and the public were not involved in the design, conduct, reporting, or dissemination plans of our research.

Eligibility criteria

We aimed to include all research papers disseminated by researchers based at the Neuro in 2019. We included research papers from any field. We used the last listed date of publication on each paper to determine whether the publication fit within this timeframe. We included all research publications irrespective of the role of the Neuro-based author (e.g., including trainees, graduate and postdoctoral students, early career researchers, and more established researchers). We included all publications where a Neuro-based author was listed, irrespective of where they were named on the author byline. We included publications in English only. We included publications in traditional peer-reviewed journals as well as those on preprint servers.

Screening and extraction

DistillerSR18 was used to manage study records retrieved by the search; this process could also have been accomplished manually, but DistillerSR helped with the organization of files. We obtained all full-text documents, and two reviewers independently screened these records against our inclusion criteria. We then extracted basic epidemiological information from each included study, including the names of the Neuro-based author(s) and the outlet of publication, and classified articles based on their study design and content area. In addition, we extracted information related to data sharing practices: whether the publication contained a data sharing statement, whether or not data sharing occurred, and, if so, what format and tools were used for this sharing. We also extracted information on a range of other open science practices, as per Table 1.
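Dual independent screening of this kind reduces to a simple reconciliation step: flag every record where the two reviewers disagree and resolve those by discussion. A minimal sketch in Python (the record IDs, decision labels, and function name are hypothetical illustrations; the actual workflow was managed in DistillerSR):

```python
def screening_conflicts(reviewer_a: dict, reviewer_b: dict) -> list:
    """Return the record IDs where two independent screeners disagree.

    Each argument maps a record ID to an "include"/"exclude" decision.
    Disagreements are returned sorted so they can be resolved by discussion.
    """
    return sorted(rid for rid in reviewer_a
                  if reviewer_a[rid] != reviewer_b.get(rid))


# Hypothetical decisions from two independent reviewers
a = {"rec1": "include", "rec2": "exclude", "rec3": "include"}
b = {"rec1": "include", "rec2": "include", "rec3": "include"}
print(screening_conflicts(a, b))  # ['rec2'] goes to consensus discussion
```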

Table 1. Other open science practices examined in the audit.

Open access: Articles were published in a format that made them free to access and build upon.
Study registration: Articles report having created a time-stamped, read-only version of the study protocol, shared prior to data collection.
Preprints: Articles were a preprint, or referenced that the publication had a linked preprint.
Reporting guideline: Articles reported adhering to a reporting guideline checklist.
Reference to a protocol: Articles linked to a study protocol (but this protocol was not formally registered).
Ethics approval: Articles indicated ethics approval was obtained/waived (or not required).
Conflict of interest: Articles included a statement of conflicts of interest.

Data analysis

Once all extracted data were in agreement (i.e., discrepancies between assessors were resolved), the complete dataset for all included articles was exported from DistillerSR into SPSS 28,19 where data were cleaned. We presented the total number of included articles and basic descriptive analyses for all items extracted, using count data and percentages. We also used Unpaywall20 to check the open access status of the articles captured in our sample. To do so, we inserted the DOIs we extracted from the included articles into the Unpaywall tool.
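The Unpaywall lookup can be sketched as follows. This is a minimal illustration against the public Unpaywall REST API (v2, which requires an email query parameter and returns fields such as `is_oa` and `oa_status`), not the exact pipeline the authors used:

```python
import json
import urllib.request

UNPAYWALL = "https://api.unpaywall.org/v2/{doi}?email={email}"


def classify_oa(record: dict) -> str:
    """Reduce an Unpaywall record to an open access label.

    Unpaywall's `oa_status` field takes the values gold, green,
    bronze, hybrid, or closed; `is_oa` is a boolean.
    """
    if not record.get("is_oa"):
        return "closed"
    return record.get("oa_status", "unknown")


def fetch_oa_status(doi: str, email: str) -> str:
    """Fetch one DOI's record from the Unpaywall API and classify it."""
    with urllib.request.urlopen(UNPAYWALL.format(doi=doi, email=email)) as resp:
        return classify_oa(json.load(resp))
```

Running `fetch_oa_status` over the extracted DOIs would reproduce the open access breakdown reported in Table 5.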

Study 2: A cross-sectional online survey of Neuro-based researchers

Sampling

The survey was closed (i.e., only open to those we invited) and was administered using SurveyMonkey software.21

Survey items

After providing informed consent, participants were presented with a 14-item survey (see Extended data). The survey was custom-built but draws on items previously reported by Van Panhuis.22 We presented participants with a series of questions regarding their willingness to share data that were developed using the 14 domains of the Theoretical Domains Framework (TDF), which helped structure the survey items pertaining to barriers and facilitators to data management and actual data sharing. The TDF is an integrated theoretical framework created to better understand health professional behavior.23 Participants were asked to indicate their level of agreement with each statement on a Likert scale from 1 (strongly disagree) to 5 (strongly agree). The Neuro ‘Open Science Grassroots committee’, a group of Neuro-based researchers targeting improvements in open science at the institution and beyond,24 reviewed and provided feedback on the survey, which we incorporated. The survey was also piloted for clarity by three researchers prior to dissemination.

Survey recruitment

We included all of the Neuro’s currently employed graduate students, postdoctoral candidates, research support staff, and independent investigators. The list of researchers was provided to us by the Neuro’s human resources. Participants were invited to complete the survey by the Director of the Neuro, via email, using a standard recruitment script. Participants were asked to complete informed consent online. The consent form described the aims of the study, specified that the data collected would be anonymous, and detailed our data management plan, which includes making data openly available. Completion of the survey was taken as implied consent.

Participants were originally sent the survey on September 20th, 2021, with a standardized reminder sent after one week (September 27th, 2021). Some targeted internal email strategies were also implemented to help maximize the response rate. We closed the survey on November 3rd, 2021.

Data analysis

Data were analyzed using IBM SPSS 28. We report the completion rate for individual items, and frequencies and percentages, or means and standard deviations, for each of the survey items. We report comparative data, using chi-square tests, illustrating findings by researcher category and gender.
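The SPSS computation is not reproduced in the paper, but the chi-square test of independence behind these comparisons is straightforward to sketch. A dependency-free Python version for a 2×2 cross-tab (the counts below are hypothetical, not survey data):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    `table` is [[a, b], [c, d]] of observed counts (e.g., rows = gender,
    columns = shared data yes/no). Compare the returned statistic against
    the chi-square distribution with 1 degree of freedom.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = (a + b, c + d)          # row totals
    cols = (a + c, b + d)          # column totals
    stat = 0.0
    for obs, r, c_ in ((a, rows[0], cols[0]), (b, rows[0], cols[1]),
                       (c, rows[1], cols[0]), (d, rows[1], cols[1])):
        expected = r * c_ / n      # expected count under independence
        stat += (obs - expected) ** 2 / expected
    return stat


# Perfectly balanced table: no association, statistic is 0
print(chi_square_2x2([[10, 10], [10, 10]]))  # 0.0
```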

For open-ended survey questions, we conducted a thematic analysis25 of responses provided. Here, two researchers familiarized themselves with the qualitative responses. Then, each of them independently coded responses. Codes were discussed and iteratively reviewed until there was consensus. Then, key themes were identified, and codes were grouped within these. Themes were defined and described in the results section.

Results

Study 1: A cross-sectional audit of Neuro publications

Sample

We initially identified 623 publications from our search. We then removed 15 duplicates and screened a total of 608 unique references; 313 publications met our inclusion criteria (see Figure 1). Of the 623 retrieved publications, about half were not research outputs. Epidemiological characteristics of included publications are provided in Table 2.


Figure 1. Flow diagram for inclusion of records.

Table 2. Epidemiological characteristics of included articles.

Is a Neuro-based author the corresponding author on the paper? (N=313)
  Yes: 162 (51.8%)
  No: 151 (48.2%)
Most frequent journals that publications were published in:
  Neurology: 9 (2.9%)
  NeuroImage: 9 (2.9%)
  PLOS ONE: 8 (2.6%)
  BRAIN: 8 (2.6%)
  Proceedings of the National Academy of Sciences: 8 (2.6%)
What type of data is being reported? (N=313)
  Human: 220 (70.3%)
  Non-human: 57 (18.2%)
  Other: 36 (11.5%)
If the origin of data is non-human, what type of data is being used? (N=57)
  Animals: 52 (91.2%)
  Other: 5 (8.8%)
If the origin of data is human, what type of data is being used? (N=302; a single publication might have used multiple types of data)
  Biological: 55 (18.2%)
  DNA: 34 (11.2%)
  Brain MRI: 97 (32.2%)
  Non-MRI imaging: 29 (9.6%)
  Behavioral: 55 (18.2%)
  Sequencing data: 12 (4.0%)
  Clinical chart: 9 (3.0%)
  Other (sleep studies, web-based survey, review, stem cells, brain tissue from autopsies, OCT testing, interview, EEG recordings): 11 (3.6%)
If the origin of data is animal, what type of animal is being used? (N=52)
  Rat: 18 (34.6%)
  Mice: 25 (48.1%)
  Other (zebrafish, worm, pig, monkeys, Aplysia californica, Caenorhabditis elegans, primate (Macaca mulatta)): 9 (17.3%)
Does the study report quantitative or qualitative data? (N=313)
  Quantitative: 210 (67.1%)
  Qualitative: 34 (10.9%)
  Both: 69 (22.0%)
What type of publication is being reported? (N=336; some studies reported two or more publication types)
  Data synthesis (e.g., systematic reviews, meta-analysis, scoping review): 45 (13.3%)
  Cost-effectiveness and/or decision analysis: 2 (0.6%)
  Clinical trial: 38 (11.2%)
  Observational study: 138 (41.5%)
  Case study/series: 4 (1.2%)
  Pre-clinical (in vivo; in vitro; genomic): 84 (25.0%)
  Multiple study types are reported: 2 (0.6%)
  Other: 23 (6.6%)

In about half of the included publications (51.8%; n=162), the Neuro-based author was the corresponding author on the paper. Neurology (2.9%; n=9), NeuroImage (2.9%; n=9), PLOS ONE (2.6%; n=8), Brain (2.6%; n=8), and the Proceedings of the National Academy of Sciences (2.6%; n=8) were the most prevalent journals that Neuro authors published in (see Table 2). The number of authors on the included papers ranged from 1 to 374. Studies with more than 100 authors were typically large collaborative analyses, such as the GBD 2016 Traumatic Brain Injury and Spinal Cord Injury Collaborators’ systematic analysis of the global, regional, and national burden of traumatic brain injury and spinal cord injury, 1990-2016, for the Global Burden of Disease Study 2016.26 The median number of authors was eight. In addition, we investigated the number of authors with a Neuro affiliation. We found that 31% (n=104) of the publications had a Neuro-affiliated author. The number of authors with a Neuro affiliation ranged from 1 to 12 in the included publications. The median number of Neuro-affiliated authors on a given publication was two.

Seventy percent of the included papers reported on clinical research (n=220), while 18.2% (n=57) reported non-human data, and 11.5% (n=36) used another form of data, such as literature reviews, protocols, combined clinical and non-human data, and simulation analyses.

When we examined the type of clinical data shared, we found that biological data (e.g., implementation of an antibody) was used in 18.2% (n=55) of publications, DNA in 11.2% (n=34), brain MRI in 32.2% (n=97), non-MRI imaging in 9.6% (n=29), behavioral data in 18.2% (n=55), sequencing data in 4.0% (n=12), and clinical charts in 3.0% (n=9) of publications. Sleep studies, web-based surveys, stem cells, brain tissue from autopsies, OCT testing, interviews, and EEG recordings were used in the remaining “other” 3.6% (n=11) of publications.

We also examined data sharing in non-human research publications. We found that animal data was used in 91.2% (n=52) of these publications; five (8.8%) used other data (e.g., radiology technology, models). Mice (48.1%; n=25) and rats (34.6%; n=18) were used in most of these publications. Sixty-seven percent (n=210) of the publications reported quantitative data and 10.9% (n=34) reported qualitative data; 22.0% (n=69) reported both quantitative and qualitative data. Just over 40% of publications (41.5%; n=138) were observational studies.

For full details of types of data, please see Table 2.

Data sharing audit findings

Complete results of our data sharing audit are reported in Table 3; here we report key findings. Two-thirds of the publications (66.5%; n=208) included a data sharing statement. Most of the data sharing statements were found at the end of the paper in a ‘declaration’ section of the publication (66.9%; n=139/208). Other statements were found in the manuscript Methods section (19.7%; n=41), in the manuscript Results section (2.9%; n=6), or in other sections (26.0%; n=54), such as the front page, the abstract, Dryad links, or the Supplementary Materials section of the publication. Of the papers that had a data sharing statement, we were able to download the data directly for 74.5% (n=155/208). Of the entire sample of included articles, 49.5% (n=155/313) had openly available data. When examining publications with data sharing statements that did not have openly available data, 58.7% (n=27 of 46) specified to contact the author to gain access to the data. Eleven percent (n=5) of these publications identified other ways to access the data (e.g., registration to a site required, or a request through the Vivli platform, which gives access to anonymized individual participant-level data (IPD) or the cleaned raw data collected during a clinical trial27). Thirty percent (n=14) of these publications did not specify a way to gain access to the data.

Table 3. Data sharing evaluation.

Did the paper include a data sharing statement? (N=313)
  Yes: 208 (66.5%)
  No: 104 (33.2%)
  Unclear (broken link): 1 (0.3%)
Where did you find the data sharing statement? (N=208, the number of manuscripts with a data sharing statement)
  In the manuscript Methods section: 41 (19.7%)
  In the manuscript Results section: 6 (2.9%)
  In a declaration section: 139 (66.9%)
  Other (e.g., front page, abstract, by request, Dryad links throughout Supplementary Material section): 54 (26.0%)
Can you directly access, download, and open the data? (from articles that had data sharing statements; N=208)
  Yes: 155 (74.5%)
  No: 46 (22.1%)
  Other (e.g., by registration, error page): 7 (3.4%)
If no, does the statement specify to contact the author to gain access to the data? (N=46)
  Yes: 27 (58.7%)
  No: 14 (30.4%)
  Other (registration required, must request an account, request through the Vivli platform, says attached at bottom of article): 5 (10.9%)
Where does the publication indicate the shared data is located? (N=182)
  Journal website: 120 (66.0%)
  Other (NeuroVault, Zenodo, GitHub, GenBank, NITRC/ABIDE, bioRxiv website, Dryad): 30 (16.5%)
  Institutional repository: 17 (9.3%)
  Open Science Framework: 8 (4.4%)
  Figshare: 7 (3.8%)
In what format is data shared? (N=204)
  PDF: 61 (30.0%)
  Word: 59 (29.0%)
  Excel: 29 (14.2%)
  HTML: 2 (0.9%)
  R: 4 (1.9%)
  SPSS: 1 (0.5%)
  Other (3D view, Python scripts, CSV, TIFF, tar.gz, multiple histological datasets, zip, wav, MATLAB, avi (video), txt, js, HTTP, mp4, png, table, ATAV): 48 (23.5%)
Does the data published have a unique digital identifier? (N=155)
  Yes: 45 (29.0%)
  No: 103 (66.5%)
  Unclear (e.g., can’t open the link): 4 (2.6%)
  Other (the same DOI as the paper itself): 3 (1.9%)

When data was shared openly it was done so on journal websites (n=120, 66.0%), the Open Science Framework (n=8, 4.4%), institutional repositories (n=17, 9.3%), Figshare (n=7, 3.8%), or other platforms (n=30, 16.5%). Data was shared mostly in PDF format (n=61, 30.0%), Word format (n=59, 29.0%), and Excel format (n=29, 14.2%). Most of the shared data (66.5%; n=103) was published without a unique identifier, with just 29.0% of publications (n=45) providing a Digital Object Identifier (DOI). It was unclear for 2.6% (n=4) of publications whether a DOI was used.

Open scripts and materials

We found that most publications (75.7%; n=237) indicated that no analysis script and/or statistical analysis plan (SAP) was available. Twenty percent (n=64) of the publications had analysis scripts/SAPs available. Four (1.3%) publications stated that the analysis scripts and statistics data were available upon request, and eight (2.6%) were classified as other (e.g., available from the corresponding author on reasonable request). The analysis scripts were available via personal or institutional webpages (n=9, 11.5%), a journal webpage (n=16, 20.5%), supplementary information hosted by the journal or a pre-print server (n=16, 20.5%), or an online third-party repository (e.g., OSF, Figshare) (n=26, 33.4%).

When publications shared an analysis script, the vast majority (93.7%; n=60/64) could be downloaded directly. There was only one publication of the 64 (1.6%) for which we could not download the analysis script.

We defined “materials availability” as sharing the materials used to conduct the study (e.g., videos of cognitive behavior therapy for psychological interventions, surveys, cell lines, reporting checklists, supplementary files, gene banks). We found that 43.8% (n=137) of the publications indicated that study materials were available. Less than one percent (n=2) of the publications indicated that the materials were available upon request. Materials were made available via the journal publication webpage (n=51, 30.9%), supplementary information hosted by the journal (n=51, 30.9%), personal or institutional webpages (n=15, 9.2%), or an online third-party repository (e.g., OSF, Figshare; n=23, 13.9%). A small number of articles indicated materials were available by request (n=6, 4.4%). Materials could not be downloaded directly for 6.5% (n=9) of publications, while 92.0% (n=126) of materials could be downloaded; two (1.5%) publications did not use such materials. Both were reviews.28,29

All of these results are included in Table 4.

Table 4. Analysis scripts and material availability.

Does the publication state whether or not analysis scripts and statistics data are available? (N=313)
  Yes - the statement says that the analysis scripts and statistics data are available: 64 (20.4%)
  Yes - the statement says that the analysis scripts and statistics data are available upon request: 4 (1.3%)
  No - there is no analysis script and statistics data availability statement: 237 (75.7%)
  Other (contact the author, available from the corresponding author on reasonable request): 8 (2.6%)
How does the statement indicate the analysis scripts are available? (N=78)
  A personal or institutional webpage: 9 (11.5%)
  A journal webpage: 16 (20.5%)
  Supplementary information hosted by the journal or a pre-print server: 16 (20.5%)
  An online third-party repository (e.g., OSF, Figshare, etc.): 26 (33.4%)
  Other: 11 (14.1%)
Can you access, download, and open the analysis and statistics files? (from the N=64 publications that included a statement about analysis script/data availability)
  Yes: 60 (93.7%)
  No: 1 (1.6%)
  Other: 3 (4.7%)
Does the publication state whether or not materials are available? (N=313)
  Yes - the statement says that the materials are available: 137 (43.8%)
  Yes - the statement says that the materials are available upon request: 2 (0.6%)
  No - there is no materials availability statement: 161 (51.4%)
  Other (in-text links given, materials given in Methods section, from the corresponding author upon reasonable request): 13 (4.2%)
How does the statement indicate the materials are available? (N=165)
  A personal or institutional webpage: 15 (9.2%)
  Supplementary information hosted by the journal: 51 (30.9%)
  An online third-party repository (e.g., OSF, Figshare, etc.): 23 (13.9%)
  Upon request from the authors: 6 (3.6%)
  On the first page of the preprint server: 2 (1.2%)
  A journal webpage: 51 (30.9%)
  Other (Open Science Badge in article, appendix in the Materials section, MATLAB, list of software versions at the end, Dryad, available without restrictions, coffeylab.ca/open-science/): 17 (10.3%)
Can you access, download, and open the materials files? (N=137)
  Yes: 126 (92.0%)
  No: 9 (6.5%)
  Not applicable: 2 (1.5%)

Transparency and open science practices

Results from our audit of transparency and open science practices are reported in Table 5. Fifteen percent (n=47) of the publications did not include a conflict-of-interest statement, 15% (n=47) indicated that there were one or more conflicts of interest, and 68.7% (n=215) indicated that there was no conflict of interest. In addition, we investigated whether the publications reported funding and, if so, the funding sources. Most of the publications (88%; n=296) included a funding statement. The primary sources of funding were the federal public funder, the Canadian Institutes of Health Research (CIHR; n=149), and the provincial funder, the Fonds de Recherche du Québec – Santé (FRQS; n=71).

Table 5. Open access evaluation.

Does the publication include a statement indicating whether there were any conflicts of interest? (N=313)
  Yes - the statement says that there are one or more conflicts of interest: 47 (15.0%)
  Yes - the statement says that there is no conflict of interest: 215 (68.7%)
  No - there is no conflict of interest statement: 47 (15.0%)
  Other: 4 (1.3%)
Does the article link to an accessible protocol? (N=313)
  Yes: 9 (2.9%)
  No: 297 (94.9%)
  It is available upon request: 5 (1.6%)
  Other (publication is a protocol): 2 (0.6%)
Does the publication state whether or not the study (or some aspect of the study) was registered? (N=313)
  Yes - the statement says that there was a registration: 8 (2.6%)
  Yes - the statement says that there was NO registration: 5 (1.6%)
  No - there is no registration statement: 300 (95.8%)
Where does the publication indicate the registration is located? (N=8)
  ClinicalTrials.gov: 7 (87.5%)
  PROSPERO: 1 (12.5%)
Does the publication report ethics approval? (N=313)
  Yes: 208 (66.5%)
  No: 41 (13.4%)
  It is not required for this study: 64 (20.4%)
Does the paper report using reporting guidelines? (N=313)
  No: 302 (96.5%)
  Yes: 11 (3.5%)
Does the publication include a statement indicating whether there were funding sources? (N=313)
  Yes - the statement says that there was funding from a private organization(s): 11 (3.5%)
  Yes - the statement says that there was funding from a public organization(s): 146 (46.6%)
  Yes - the statement says that there was funding from both public and private organizations: 93 (29.7%)
  Yes - the statement identifies funding sources, but the private/public status is unclear: 22 (7.0%)
  Yes - the statement says that no funding was provided: 6 (1.9%)
  No - there is no funding statement: 35 (11.2%)
Publication status (N=313)
  Open access: 219 (70.0%)
  Not open access: 91 (29.0%)
  Unclear: 3 (1.0%)
Open access status (N=219)
  Green: 62 (28.3%)
  Bronze: 44 (20.1%)
  Gold: 83 (37.9%)
  Hybrid: 30 (13.7%)

Most publications (94.9%; n=297) did not link to an accessible protocol; only 2.9% (n=9) did. Five publications (1.6%) reported that a protocol was available upon request, and two (0.6%) were themselves protocols.

Most publications (95.8%; n=300) did not include a study registration statement. Only 2.6% (n=8) of the publications indicated a registration, while 1.6% (n=5) explicitly stated that there was no registration. Most registrations (87.5%; n=7) were clinical trial registrations on ClinicalTrials.gov; one (12.5%) was on the International Prospective Register of Systematic Reviews (PROSPERO).

Ethics approval was reported in 66.5% (n=208) of publications. Ethics approval was not relevant or not required for 20.4% (n=64) of publications. Most publications (96.5%; n=302) did not report using a reporting guideline to improve the completeness and transparency of the completed research; only 3.5% (n=11) of publications explicitly mentioned adherence to reporting guidelines.

We found that 70.0% (n=219) of the articles were open access, while 29.0% (n=91) were not openly available. Data were missing for 1.0% (n=3) of the articles. Among the studies that were openly available, 37.9% (n=83 of 219) were in ‘gold’ open access journals (i.e., the final published version of the article, or Version of Record, is immediately, permanently, and freely available online). Thirteen percent (n=30 of 219) were published in hybrid journals, 20.1% (n=44 of 219) were in ‘bronze’ journals (i.e., the final published version is free to read on the publisher page but without a Creative Commons license), and 28.3% (n=62 of 219) were made available via ‘green’ open access (i.e., a version of the manuscript is placed in a repository, often after an embargo, making it freely accessible for everyone) (see Table 5).

Study 2: A cross-sectional online survey of Neuro-based researchers

Participants

Of the 553 individuals who were emailed the survey, 22.42% (n=124) completed it. We removed 10 participants from the analysis as they had answered only the demographic questions.
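The reported response rate follows directly from these counts; as a quick arithmetic check:

```python
def response_rate(completed: int, invited: int) -> float:
    """Survey response rate as a percentage, rounded to two decimal places."""
    return round(100 * completed / invited, 2)


# 124 of the 553 invited researchers completed the survey
print(response_rate(124, 553))  # 22.42
```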

Demographic details of these participants are provided in Table 6. Forty-seven (41.2%) participants were female and 66 (57.9%) were male; one participant (0.9%) preferred not to say. Most of the participants were between 35 and 44 years of age, and 61.4% (n=70 of 114) had a PhD. Of the 114 participants who answered the question about their research role, slightly more than a third (36.8%; n=42) were staff (manager, associate, assistant). Approximately one in three (32.5%; n=37 of 114) were trainees (e.g., MSc or PhD students, postdoctoral fellows).

Table 6. Demographic data.

Gender (answered n=114; skipped n=0)
  Female: 47 (41.2%)
  Male: 66 (57.9%)
  Prefer not to say: 1 (0.9%)
Age (answered n=114; skipped n=0)
  18-24: 10 (8.8%)
  25-34: 31 (27.2%)
  35-44: 34 (29.8%)
  45-54: 17 (14.9%)
  55-64: 19 (16.7%)
  65 or older: 3 (2.6%)
Degrees (answered n=114; skipped n=0)
  Bachelor's degree: 14 (12.3%)
  Master's degree: 21 (18.4%)
  MD: 6 (5.3%)
  PhD: 70 (61.4%)
  Prefer not to say: 1 (0.9%)
  Other (please specify): 2 (1.8%)
Status at The Neuro (answered n=114; skipped n=0)
  Trainee (e.g., MSc, PhD student, postdoctoral fellow): 37 (32.5%)
  Staff (manager, associate, assistant): 42 (36.8%)
  Principal Investigator: 33 (28.9%)
  Prefer not to say: 2 (1.8%)

Data sharing practices and training in data sharing

We asked participants two questions about their data sharing practices in the past 12 months (see Table 7). Over half of those who responded (61.4%; n=70 of 114) reported that they had published a first or last authored paper in the last 12 months. Of these, 70.0% (n=49) indicated that they had openly shared research data related to one of their publications.

Table 7. Data sharing experience and training.

Question / response                                         N     %
Have you published a first or last authored manuscript in the last 12 months? (answered n=114)
  Yes                                                       70    61.4%
  No                                                        44    38.6%
For those who had been a first/last author in the past 12 months:
In the past 12 months, have you openly shared research data related to anything you have published yourself as first or last author? (answered n=70; skipped n=44)
  Yes                                                       49    70.0%
  No                                                        21    30.0%
Have you engaged in training (online webinars, workshops, or a course) around data sharing? (answered n=113; skipped n=1)
  Yes - within the last 12 months                           43    38.0%
  Yes - within the last 3 years                             20    17.8%
  Yes - 3 or more years ago                                 4     3.5%
  Never                                                     46    40.7%

We asked one question about past and future engagement in training about data sharing; the results are presented in Table 7. More than a third of respondents (40.7%; n=46 of 113) indicated that they had never engaged in training related to data sharing (e.g., online webinars, workshops, or a course). A similar proportion (38.0%; n=43) reported having engaged in some form of data sharing training within the last 12 months. Twenty respondents (17.8%) reported having engaged in such training within the last three years, and a few (3.5%; n=4) indicated they had done so more than three years ago.

Participants indicated the strongest preference for an online training video that they could return to; a series of several modules, each lasting about 10 minutes, and a live webinar were less preferred. These results are reported in Table 8.

Table 8. Preferences for data sharing training format.

Format of the data sharing training (ranked 1 = most preferred to 4 = least preferred; counts with % of respondents):

A single module lasting about 2 hours:
  20 (18.87%), 36 (33.96%), 28 (26.42%), 22 (20.75%); Total=106; Mean score=2.49; SD=1.03
An online video you could return to:
  45 (41.67%), 28 (25.93%), 26 (24.07%), 9 (8.33%); Total=108; Mean score=1.99; SD=1.00
A series of several modules, each lasting about 10 minutes:
  23 (21.30%), 26 (24.07%), 29 (26.85%), 30 (28.7%); Total=108; Mean score=2.61; SD=1.10
A live webinar:
  21 (19.63%), 18 (16.82%), 22 (20.56%), 46 (42.99%); Total=107; Mean score=2.87; SD=1.17

Participants showed a preference for an online handbook walking through the practical steps of data sharing. Next, they indicated they would value access to a data sharing expert (hired by the Neuro) to consult directly when questions arise about data sharing. The remaining resources, in order of preference, were: an online interactive learning module; an online video walking through the practical steps of data sharing (why, where, how); a central data sharing expert who facilitates data sharing for projects, working directly with the project team; and a collection of best practice case study examples (see Table 9).

Table 9. What types of data sharing resources would be most helpful for you? (scale=6).

What types of data sharing resources would be most helpful for you? (ranked 1 = most helpful to 6 = least helpful; counts with % of respondents):

An online interactive learning module:
  22 (20.95%), 17 (16.19%), 16 (15.24%), 15 (14.29%), 12 (11.43%), 23 (21.90%); Total=105; Mean score=3.45; SD=1.84
An online handbook walking through the practical steps of data sharing (why, where, how):
  27 (25.47%), 20 (18.87%), 22 (20.75%), 17 (16.04%), 14 (13.21%), 6 (5.66%); Total=106; Mean score=2.90; SD=1.55
An online video walking through the practical steps of data sharing (why, where, how):
  11 (10.38%), 19 (17.92%), 25 (23.58%), 23 (21.70%), 14 (13.21%), 14 (13.21%); Total=106; Mean score=3.49; SD=1.52
Access to a data sharing expert (hired by the Neuro) to directly consult with when questions arise about data sharing:
  22 (20.75%), 24 (22.64%), 13 (12.26%), 24 (22.64%), 13 (12.26%), 10 (9.43%); Total=106; Mean score=3.11; SD=1.62
A collection of best practice case study examples:
  7 (6.60%), 6 (5.66%), 23 (21.70%), 16 (15.09%), 29 (27.36%), 25 (23.58%); Total=106; Mean score=4.22; SD=1.49
A central data sharing expert that facilitates data sharing for projects working directly with the project team:
  18 (16.82%), 21 (19.63%), 8 (7.48%), 10 (9.35%), 23 (21.50%), 27 (25.23%); Total=107; Mean score=3.75; SD=1.89

Perceptions about data sharing

Participants were most familiar with patient privacy considerations when sharing data (Mean=3.14, SD=1.091), the ethical considerations when sharing their data (Mean=3.13, SD=1.001), and the practical steps involved to share their data (Mean=2.91, SD=1.085). Respondents were less familiar with concepts including the First Nations Principles of Ownership, Control, Access, and Possession (OCAP)7 (Mean=1.45, SD=1.710) and new metrics to measure data sharing contributions (Mean=1.86, SD=0.872). See Table 10 for full results.

Table 10. Familiarity with data sharing concepts (Not at all familiar=1, Not so familiar=2, Somewhat familiar=3, Very familiar=4, Extremely familiar=5).

Data sharing concepts (counts with % of respondents for responses 1, Not at all familiar, through 5, Extremely familiar):

Federal mandates related to sharing your data:
  35 (31.53%), 27 (24.32%), 28 (25.23%), 19 (17.12%), 2 (1.80%); Total=111; Weighted average=2.33; SD=1.147
Legal barriers related to sharing your data:
  18 (16.22%), 35 (31.53%), 42 (37.84%), 14 (12.61%), 2 (1.80%); Total=111; Weighted average=2.52; SD=0.971
The new metrics to measure data sharing contributions:
  45 (40.54%), 42 (37.84%), 20 (18.02%), 3 (2.70%), 1 (0.90%); Total=111; Weighted average=1.86; SD=0.872
Ethical considerations when sharing your data:
  10 (9.01%), 13 (11.71%), 47 (42.34%), 35 (31.53%), 6 (5.41%); Total=111; Weighted average=3.13; SD=1.001
Copyright considerations when sharing your data:
  16 (14.41%), 31 (27.93%), 41 (36.94%), 17 (15.32%), 6 (5.41%); Total=111; Weighted average=2.69; SD=1.068
Patient privacy considerations when sharing data:
  10 (9.01%), 18 (16.22%), 41 (36.94%), 31 (27.93%), 11 (9.91%); Total=111; Weighted average=3.14; SD=1.091
The FAIR Principles for data sharing:
  39 (35.14%), 27 (24.32%), 23 (20.72%), 18 (16.22%), 4 (3.60%); Total=111; Weighted average=2.29; SD=1.209
The First Nations Principles of OCAP:
  72 (64.86%), 30 (27.03%), 8 (7.21%), 0 (0.00%), 1 (0.90%); Total=111; Weighted average=1.45; SD=1.710
Practical steps involved to share your data:
  16 (14.68%), 14 (12.84%), 50 (45.87%), 22 (20.18%), 7 (6.42%); Total=109; Weighted average=2.91; SD=1.085
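The weighted averages above follow the standard formula for Likert-type counts, and the reported SDs appear consistent with the sample (n-1) standard deviation. A minimal sketch in Python reproduces the “Ethical considerations” row of Table 10 (the function name is our own illustration, not part of the study's analysis code):

```python
import math

def likert_stats(counts):
    """Weighted mean and sample SD for Likert-type response counts.

    counts[i] is the number of respondents choosing scale point i+1
    (1 = Not at all familiar ... 5 = Extremely familiar).
    """
    n = sum(counts)
    mean = sum(c * (i + 1) for i, c in enumerate(counts)) / n
    # Sample (n - 1) standard deviation, which matches the reported values.
    var = sum(c * (i + 1 - mean) ** 2 for i, c in enumerate(counts)) / (n - 1)
    return mean, math.sqrt(var)

# "Ethical considerations when sharing your data" row of Table 10
mean, sd = likert_stats([10, 13, 47, 35, 6])
print(round(mean, 2), round(sd, 3))  # 3.13 1.001
```

The same computation reproduces the other rows (e.g., federal mandates: mean 2.33, SD 1.147), which is how we inferred that the sample rather than population SD was used.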

Most respondents agreed that data sharing helps to stimulate new hypotheses from the study (Mean=4.38, SD=0.671), that they want to help others use their study results (Mean=4.34, SD=0.691), that data sharing advances their research by allowing additional investigators to access the data for future research (Mean=4.33, SD=0.730), that they want to help others reproduce their study (Mean=4.31, SD=0.767), that they want to help others transparently assess their study (Mean=4.28, SD=0.720), and that they are optimistic that efforts to adhere to data sharing best practices will help support greater reproducibility and transparency of research (Mean=4.27, SD=0.862). Respondents tended to disagree that the benefit of data sharing is delayed and uncertain (Mean=2.51, SD=0.989), that there is sufficient financial support to help them adhere to data sharing best practices in the coming year (Mean=2.57, SD=1.016), and that they feel stressed when thinking about how to adhere to data sharing best practices for their studies in the coming year (Mean=2.72, SD=1.014). For full results, see the Extended data.

Thematic analysis

For the item “What incentives do you think the Montreal Neurological Institute-Hospital could introduce to recognize data sharing?” we classified text-based responses into six themes. The themes were: 1) financial support, 2) recognize and incentivize data sharing, 3) provide infrastructure to support data sharing, 4) enforcement of specific and clear data sharing standards, 5) change research and recognition priorities, and 6) provide educational support for data sharing (see Table 11).

Table 11. Incentives to recognize data sharing.

Theme 1: Financial support. Providing money in diverse formats to researchers to support and encourage data sharing efforts.
  Provide funding for data sharing (n=2). Example: "Funding for hosting the data"
  Waiving or supporting open access publishing fees (n=1). Example: "Many universities (e.g. Cambridge) have agreements with publishers to waive open access fees."
  Provide scholarships for data sharing (n=1). Example: "Give scholarships and recognitions"
  Provide grants for data sharing (n=2). Example: "Consider adding additional points in grant and fellowship application evaluation"
  Provide waivers for data acquisitions (n=1). Example: "waivers for data acquisitions"
  Provide prizes for data sharing (n=9). Example: "Providing Prizes"
Theme 2: Recognize and incentivize data sharing. Providing promotion and recognition as an outcome for data sharing to encourage researchers to share their data.
  Provide promotion to researcher (n=7). Example: "By recognizing this as a facilitator for promotion."
  Provide merit reviews to researcher (n=1). Example: "Open data practices could be evaluated as equal to formal publications in annual merit assessments."
  Engage department and faculty in providing incentives to share data (n=1). Example: "the Department and Faculty are the true employers and evaluators."
  Provide media coverage (n=1). Example: "Media coverage (e.g. through social networks)"
  Competitions for data sharing innovations (n=1). Example: "Promotion of shared data from Neuro researchers within McGill/The Neuro (newsletter, etc) to appreciate them and facilitate collaboration/continued innovation with said shared data"
Theme 3: Provide infrastructure to support data sharing. Providing technical support for data sharing in the form of online infrastructure or support.
  Provide appropriate storage for data sharing (n=2). Example: "Provide resources needed for data storage and sharing for free or low price"
  Provide increased access to databases (n=2). Example: "Increased access to databases"
  Provide technical support for data sharing (n=2). Example: "I am unsure if incentives are needed, for me and many trainees the barrier is the technicality and legality behind sharing data and not so much the "lack of an incentive". I truly believe that advancing research through the transparency of data sharing is an incentive itself"
  Offer an official data sharing platform (n=1). Example: "I think the biggest barrier to seeing benefits is likely to be a good repository to store the data. There is the server costs that need to addressed for making data public. Additionally, a good repository will have some search engine optimization so that it can easily be found. I think the best incentive would be for the Neuro to offer such a platform. Then the Neuro can post the necessary guidelines that are to be met for sharing your data."
  Promote use of specific data sharing hubs (n=1). Example: "We could also promote the use of specific data sharing hubs, perhaps linking to neuro investigator uploads."
Theme 4: Enforcement of specific and clear data sharing standards. Providing clarity about data sharing expectations to researchers by setting widespread guidelines and standards.
  Provide structured guidelines for data sharing (n=2). Example: "Clear guidelines"
  Make data sharing practice a baseline standard for researchers (n=1). Example: "More importantly, the microscopy facility also lacks the manpower and expertise to spread any sort of data standards - with only 1 and a half employees it is challenging enough for them to even keep our many microscopes running. I think that before cell biologists can be incentivized to productively share data, we must have a basic level of digital infrastructure, and need to spread standards amongst researchers (especially new trainees) ideally through a collaboration between the microscopy facility and data sharing experts from other fields"
  Provide good examples of data sharing for researchers (n=1). Example: "The big labs and PIs can start sharing their data to set an example. Neuro can also try to anonymize vast clinical data available at the center to showcase how it can be done properly and at large scale and set an example for the rest of the community"
  Provide an embargo period on data sharing after data collection (n=1). Example: "I strongly believe that there should be an embargo on data sharing after the data has been collected, to recognize all these efforts and allow for the people who are doing the legwork to benefit from it first. It bothers me a lot that so many people promote data sharing, yet, the idea of embargo is not one that I hear a lot (in fact, it is never present in all the seminars I attended on data sharing). Personally, I am against sharing data without an embargo period that should end once the first study/report is published or a certain amount of time has elapsed, whichever comes first."
  Make data sharing mandatory during the research process (n=1). Example: "Making data sharing mandatory for researchers before receiving a new research fund (or renewing an existing research fund): establishing a clear plan of how and when data will be shared before even starting the research"
  Make data management plans mandatory before starting research (n=1). Example: "Establishing a clear plan of how and when data will be shared before even starting the research"
Theme 5: Change research and recognition priorities. Recommendations for a change in priorities in recognition of research in order to encourage data sharing; a change in values would be supported by a change in the research structures and systems.
  Place less of an emphasis on recognition of a researcher's publications (n=1). Example: "Recognize DOIs in promotion in place of publications."
  Place less of an emphasis on delegation of funds (n=1). Example: "Place less of an emphasis on delegation of funds"
  Provide direct benefits to researcher for data sharing (n=1). Example: "For most of us basic scientists working in cell & animal models, we share our data through open access publications and sharing on request. If those working on big data, brain imaging, or 'omics' are to receive a benefit or incentives that is unfair to those who don't need to or have appropriate data for sharing."
Theme 6: Provide educational support for data sharing. Providing human and training resources for data sharing to facilitate data sharing practices and to support researchers during the research process.
  Provide more opportunities for data sharing to be practiced (n=1). Example: "Make some postdocs/research associates longer-term offers where they can generate data worth sharing (and share it)."
  Provide training for data sharing (n=2). Example: "Give us more training about it"
  Provide resources at a low/no price for data sharing (n=1). Example: "Provide resources needed for data storage and sharing for free or low price"
  Provide academic support for data sharing (n=1). Example: "Support for academic promotions and tenure."
  Specialized staff for data sharing (n=6). Example: "Access to expert in data sharing which would provide a very detailed and thorough custom step by step guide for sharing data for each specific project."
  Provide internal support for research data management (n=1). Example: "Support to best use the resource"

For the question “Is there anything else you want to share about data sharing?” we classified responses into eight themes. The themes were: 1) lack of resources for data sharing, 2) barriers to data sharing, 3) recommendations for standards surrounding data sharing, 4) relevancy of data sharing, 5) provide training for data sharing, 6) privacy considerations when sharing data, 7) technical requirements for data sharing, 8) data sharing requires more resources (see Table 12).

Table 12. Comments about data sharing.

Theme 1: Lack of resources for data sharing. Difficulty for data sharing to be put into practice because researchers lack time and resources, which would be put to better use for other aspects of their research.
  Lack of time for data sharing (n=2). Example: "But it's hard to find the time to do data sharing"
  Lack of resources for data sharing (n=2). Example: "Weird for an Open Science institution: there is no shared catalog of resources (mouse lines, cell lines, plasmids, equipment, etc.) available at the Neuro."
Theme 2: Barriers to data sharing. Barriers exist that prevent and discourage researchers from data sharing.
  Publishing barriers to sharing data (n=1). Example: "The other critical issue that prevents data sharing for the purpose of reproducibility and transparency is the lack of journals that accept purely reproductive or replication research (i.e. studies trying to replicate previous fundings) or journals that will publish negative findings. These are majore barriers to promoting good science and encouraging people to share their data."
  Technical barriers prevent data sharing (n=3). Example: "The "technical" barriers I face aren't (e.g.,) knowing how to upload data; it's that hosting our enormous raw data files is expensive and not supported by OSF: 5 Gb is literally 5 minutes of our experiments. A "career" issue I face is that we often collect datasets that let us answer several questions simultaneously and sharing that seems like a recipe to be scooped. I also don't think the incentives are really aligned: an additional paper is far, far more valuable to ECRs than a citation."
  Research Ethics Board constrains data sharing (n=3). Example: "The main barrier to data sharing in my work is much REB constraints (on whether data can be shared or not, and which repositories are acceptable for hosting data), and that some of my work is in patients for which data sharing is much more sensitive and hard to get REB-approved."
  Ownership/misuse of data concerns when sharing data (n=1). Example: "I'm a research assistant, so I have not my own "study data" to share. I agree that data sharing will help accelerate scientific progress but privacy and personal data sharing must be regulated."
  Involvement of the legal system in scientific research is a burden (n=1). Example: "I think the involvement of the legal system and especially lawyers into the mix of scientific inquiry have been a burden and will continue to be. They are a millstone around the neck of scientific progress."
  Lack of incentive to share data (n=3). Example: "We receive a lot of training opportunities on data sharing. But it's hard to find the time to do data sharing. And there is neither necessity to do it, nor any major incentive."
Theme 3: Recommendations for standards surrounding data sharing. Items that should be considered when widely enforcing and encouraging data sharing in research.
  Data sharing standards must accommodate diversity in research (n=2). Example: "I don't do brain imaging for which there are a lot of standards. Transcriptomic data is easy to share because there are repositories for this and so this data is easy for me to share because there are clear instructions. Other data types, microscopy imaging, flow cytometry, MEA do not have clear standards or repositories (that I am aware of). I would like to share data but I don't know how. I share my code on github, but I don't know if I'm doing thing correctly."
  Data sharing requires innovation (n=1). Example: "we need to innovate, not just be followers."
  Data sharing standards must accommodate diversity in research (n=1). Example: "Guidelines for adequate data sharing practices should be introduced to all Neuro labs. There are wast differences in the type of research (thus data produced) conducted at the Neuro, that should be openly shared however labs need to know the recommended best practices: which omics repositories to use, Western blot repositories, when to publish on pre-print servers, etc."
  Provide guidelines for data sharing (n=1). Example: "Doing it correctly is not clear and having explicit online guidelines devoid of jargon and preassumptions would be great. I find that I cannot even get started often because the resources assume greater expertise in database management than I have."
Theme 4: Relevancy of data sharing. Data sharing can feel more or less important depending on a researcher's specific field or who may benefit from data sharing.
  Data sharing is not relevant to my research (n=3). Example: "Because my lab doesn't study people, my answers don't reflect any concerns about privacy"
  Data sharing should be more accessible to researchers and the public (n=1). Example: "However the general findings and how they are applicable for the public at large is important and I feel is lost in the process of publication."
Theme 5: Provide training for data sharing. Providing training and education about data sharing practices can encourage the practice of data sharing.
  Provide training for data sharing (n=3). Example: "I'm a new master's student and I feel like I haven't been exposed to any data sharing information sessions. I'm at the point where I'm only beginning to generate data so sharing it isn't something that I've even thought of. I have almost zero knowledge on how to go about it. It could be nice to have some training on for new students so we can have these ideas in mind from the start of our careers in science."
  Researchers need clarity on when to share data (n=1). Example: "I think the exact moment when sharing has to be carefully considered."
Theme 6: Privacy considerations when sharing data. Concerns exist when sharing data in specific research (e.g., patient data in clinical trials).
  Privacy concerns when sharing data (n=2). Example: "But privacy and personal data sharing must be regulated."
  Data sharing privacy concerns are not relevant to all research (n=1). Example: "privacy and personal data sharing must be regulated."
Theme 7: Technical requirements for data sharing. Providing support for the technical requirements of data sharing is necessary for data sharing practices to be facilitated.
  Provide the required infrastructure for data sharing (n=2). Example: "We need tools and infrastructure that actually work and are inclusive of all research realities."
  Provide repositories for data sharing (n=1). Example: "Guidelines for adequate data sharing practices should be introduced to all Neuro labs. There are waste differences in the type of research (thus data produced) conducted at the Neuro, that should be openly shared however labs need to know the recommended best practices: which omics repositories to use, Western blot repositories, when to publish on pre-print servers, etc."
  Provide technical support for data sharing (n=3). Example: "It's more complicated than you think. The Neuro needs to put more support into the technical side of open data-sharing. Of course, the 'marketing' and administration of open neuroscience at the Neurologist is important but without a solid technical basis, the entire enterprise will fail."
Theme 8: Data sharing requires more resources. Requirements for human or financial resources in order for data sharing to be practiced.
  Data sharing requires more funding (n=2). Example: "Open Access is great and needs to be championed, however it is unreasonable to expect that every lab a) can do it, or b) should, due to tight funding and unethically competitive practices of other labs in the field"
  Provide internal support for data sharing (n=1). Example: "The Neuro needs to put more support into the technical side of open data-sharing"

Discussion

The aim of the first study was to audit all publications produced by Neuro researchers in 2019. We found that 66.5% (n=208) of publications had a data sharing statement, and 61% (n=31) of publications specified that the authors should be contacted to gain access to the data. Sharing by request creates burden and barriers for other researchers seeking to access and use data, and should be implemented only in instances where open data is not possible. Most publications included neither a registration statement nor an available protocol. We also note that authors were more willing to share publication materials than analysis scripts. Most publications did not report using a reporting guideline to maximize the completeness and transparency of their research.

In the second study, to identify barriers and facilitators to data sharing, we found a preference for training developed as an online video that participants could return to, a series of several modules each lasting about 10 minutes, or a single module lasting about 2 hours. A key finding was that more than a third of the respondents (40.7%) had never engaged in training around data sharing. We identified learning gaps in some key areas (e.g., the OCAP Principles, metrics to measure data sharing contributions). Respondents noted that the barriers they faced when sharing data included lack of financial support, training, and technical support.

We recommend an educational and training intervention devoted to data sharing practices to further normalize and support this practice, including training on sharing statistical analysis plans and other materials related to data sharing. This training could then be followed by additional sessions on further open science practices. Further research is needed to examine data sharing practices across universities, to track needs and preferences over time, and to investigate how to provide clear data sharing standards for staff, researchers, and managers.

It is also important to repeat the audit over time to track changes in Neuro publications and in researchers' perceptions, barriers, and facilitators; we suggest repeating the audit biennially with the same survey questions. In addition to the same questions, new questions could be included to gain a better understanding of certain data sharing practices at universities and scientific institutes.

Strengths and limitations

The results of the two studies will provide the Neuro with a better understanding of open access status and of barriers and facilitators to data management and sharing, and will identify educational needs related to data sharing that can reduce barriers to sharing. We hope this report also provides other organizations wanting to engage their communities about data sharing with a valid approach to the topic and some comparative data. Further, the audit of publications can serve as a baseline to benchmark improvements in data sharing and other open science practices and to measure progress over time. Decision-makers in government and universities may also be able to structure future open access efforts by providing training for their researchers.

We acknowledge certain limitations. First, as less than a quarter of the Montreal Neurological Institute-Hospital's staff completed this survey, the results of the study cannot be generalized. In addition, changes in the community are hard to measure unless two or more surveys are conducted at different time points; we therefore recommend annual surveys of the Neuro community to track changes in needs and preferences over time.

Registration

This study was registered a priori (i.e., before any data collection) using the Open Science Framework (https://osf.io/3tafc).

How to cite this article
Ebrahimzadeh S, Cobey KD, Presseau J et al. A cross-sectional audit and survey of Open Science and Data Sharing practices at The Montreal Neurological Institute-Hospital [version 1; peer review: 1 approved with reservations, 1 not approved]. F1000Research 2023, 12:1375 (https://doi.org/10.12688/f1000research.138196.1)
Open Peer Review

Reviewer Report (28 Nov 2023): Felix Nikolaus Wirth, Berlin Institute of Health, Berlin, Germany. Status: Approved with Reservations.
The presented work is an assessment assessing (1) the data sharing and open science practices of the institution of the authors as well as (2) the perceptions of the institution’s staff regarding data sharing and open science.

…
Nikolaus Wirth F. Reviewer Report For: A cross-sectional audit and survey of Open Science and Data Sharing practices at The Montreal Neurological Institute-Hospital [version 1; peer review: 1 approved with reservations, 1 not approved]. F1000Research 2023, 12:1375 (https://doi.org/10.5256/f1000research.151380.r219458)
Reviewer Report (03 Nov 2023): Evgeny Bobrov, QUEST Center for Responsible Research, Berlin Institute of Health at Charité, Berlin, Germany. Status: Not Approved.
In the first part of the study (‘audit’), the authors present an analysis of 313 research articles from their institution, published in 2019. The authors determined by manual screening whether for these articles data had been shared, as well as …
Bobrov E. Reviewer Report For: A cross-sectional audit and survey of Open Science and Data Sharing practices at The Montreal Neurological Institute-Hospital [version 1; peer review: 1 approved with reservations, 1 not approved]. F1000Research 2023, 12:1375 (https://doi.org/10.5256/f1000research.151380.r219459)
