Keywords
Early Career Researchers, Open Science, Open Scholarship, Open Access, Open Peer Review, Scholarly Publishing, Science Communication, Europe
This article is included in the Research on Research, Policy & Culture gateway.
In 2016, the European Commission (EC) presented a renewed vision for European research and innovation policy centred around the three O’s: ‘open innovation, open science and open to the world’,1 which has since been implemented by various groups of European science stakeholders, such as universities, funding organizations, and publishers.2 Open Science (OS) generally means transparent and accessible knowledge that is shared and developed through collaborative networks.3 OS is an umbrella term for various practices such as Open Access, Open Data, Open Methodology, Open Source, Open Peer Review (OPR), Open Reproducible Research, Open Education, Alternative Metrics, and Citizen Science.3–5 OS makes science more efficient, reliable, and responsive to societal challenges by opening up access to research data and results via new digital technologies and collaborative tools.6–8 Utilising OS, even if selectively and without commitment to total openness, brings many advantages to researchers, including increased visibility, “liberation” from many perceived restrictions such as the need to produce only statistically significant results, and the fostering of creativity.9–11 The value of OS has become even clearer worldwide during the COVID-19 pandemic, which has highlighted the need for urgent access to scientific information, as well as the enhancement of scientific collaboration and knowledge-based decision making.12–14
For OS to become the dominant publication style, researchers need appropriate discipline-dependent skills training and professional development at all stages of their research careers.15 Moreover, the ideal time to build OS skill sets is early in the research career, especially since the benefits of OS for early career researchers (ECRs) are tangible (e.g., increased reliability and visibility of their research) and should not be neglected.9,17 Thus, ECRs may be the key to definitively switching towards OS.16 However, for ECRs to become change agents, significant challenges regarding widespread OS implementation (e.g. difficulties in adopting OS practices, access to tools and training, the time cost for additional requirements, and a lack of proper incentive and reward systems) must be addressed. In particular, these issues must be addressed during the current transition towards research openness due to the multidimensional and complicated nature of the changes needed.6,8,18 Widespread OS implementation requires a broader and more inclusive understanding of scientific contributions and expectations of productivity; thus a change in attitude needs to occur in academics at all levels, research organisations, and funders.9,19,20 For this to happen, regular and systematic monitoring of ECRs’ views, knowledge, and skills on OS, together with related policy adjustments at institutional, local, national, and international levels, are crucial.
Previous studies have assessed the awareness of, and attitudes towards, OS or its specific components, such as Open Access publishing,21 OPR22 or preprints,23 but these were mostly focused on researchers within a specific scientific field e.g., social science, agriculture or health research24–26 or location, network or institution.27–29 Moreover, although some of these studies sought feedback from ECRs, their specific attitudes towards OS and new approaches to scholarly communication have been explored only to a limited extent due to the aspectual, field or location focused frames.9,17,24,26,30,31 In 2019, a transnational survey reported a significant degree of diversity in terms of scholarly communication attitudes and practices amongst ECRs in the eight countries studied,31 highlighting the need to study OS awareness on a multinational scale and across disciplines.
In recent years, research funding organisations, including those administered by the EC, have started introducing mandatory OS policies,5 as well as building tools and infrastructures intended to help researchers comply with the new requirements. With the vision of making research outputs available to the public, the European Commission launched a new Open Access publishing platform, Open Research Europe (ORE), in March 2021 for the beneficiaries of the European Framework Programmes for Research and Innovation, Horizon 2020 and Horizon Europe. The ORE platform has been developed by F1000 Research as an Open Access publishing venue. Using an innovative approach in which publication and peer review are done independently, the platform offers rapid publication of a wide range of research outputs under an open license, together with OPR.
Eurodoc, the European Council of Doctoral Candidates and Junior Researchers, is a pan-European non-profit, volunteer organization and a federation of 29 national organisations, representing ECRs across Europe. In 2020, Eurodoc was among the organizations formally included as expert partners of ORE, with the mission to help steer the project and ensure that ECRs as stakeholders were reached. During the ORE platform development, Eurodoc members noticed a lack of comprehensive surveys on ECRs’ knowledge of and attitudes toward publishing in OS encompassing the European sub-regions. Furthermore, previous analyses had rarely sought to address European countries' highly variable economic status, namely the financial resources allocated to science and how that relates to the awareness of OS in its broadest sense. Because Eurodoc operates at the pan-European level and represents a wide network of member organizations acting in many European countries and regions, an opportunity for a survey at the European level emerged.
Consequently, the survey was dedicated to gauging perspectives on OS and contemporary scholarly publishing, and was conducted between May and August 2020, as part of the collaboration between Eurodoc and the ORE project. The survey questions were developed by Eurodoc members (OS Ambassadors and OS Working Group members, all of whom were ECRs in diverse fields of study, based in 18 European countries and actively advocating for OS) during online meetings between April and May 2020. The survey authors shared the draft versions and collected feedback from experts during the ORE Communications Group meetings. Experts’ comments were incorporated into the final version of the questionnaire. The survey was openly accessible to everyone willing to participate and was distributed via Eurodoc channels, its partners and members: national organizations of doctoral candidates and junior researchers in Europe (through social media, websites, newsletters, mailing lists, and personal contacts).
The main aim of this exploratory study was to gain new perspectives regarding the awareness of, and attitudes towards, OS and related practices within Europe. We also investigated the hypothesis that differences in attitudes might arise in relation to factors such as economic indicators, research and development investment, field of study, and career stage. Though the study was initially designed to contribute to the ORE project, it grew beyond this aim. We believe the results presented here may be relevant to anyone interested in understanding the role of ECRs in OS and can therefore be valuable to researchers, their representatives, funders, and decision-makers at institutional, national, and EU levels.
All respondents were informed about the background, aims and conditions of the survey through a dedicated web page that explained that all responses were anonymous and voluntary, and that all data would be kept confidential and evaluated anonymously in accordance with the Regulation (EU) 2016/679 (GDPR). Respondents were also informed that the survey data and results would be published. Written informed consent was obtained from the respondents (completion of the questionnaire was taken as consent to participate in the study). Ethical approval for the survey was waived by the Ethics Committee of the Lviv Polytechnic National University Academic Board. This waiver of approval was provided due to non-sensitive data obtained from the survey respondents. Choosing the EUSurvey platform as the questionnaire instrument allowed a high degree of privacy, as no email addresses, IP addresses, nor any other identifying data were collected alongside the responses. All free-text data was analysed separately from the main dataset.
The final survey version was available on the EUSurvey platform as an openly accessible online questionnaire. It was open between 26 May and 15 August 2020.
In order to maximize the number of responses, the survey was designed to be as brief to complete as possible, while still collecting meaningful data. The final version agreed by the study team took around seven minutes to complete and contained 35 questions: 14 of which were mandatory, four were optional, and an additional set of 17 dependent on answers to preceding questions.33 Most of the questions took the form of multiple choice, with free-text options available when choosing the response “Other”. Questions 1-2 gathered demographic data, whilst 3-7 and 9-11 assessed the academic and, specifically, the peer reviewing experience of participants, as well as their level of satisfaction regarding professional rewards for peer reviewing. Questions 12-16 examined how respondents communicate professionally and, in particular, disseminate their research outputs, as well as their level of satisfaction regarding the latter. The following questions then assessed knowledge/experience and general attitudes towards OS (questions 17-22) and OPR (questions 23-29). Question 8 and questions 30-34 sought feedback regarding participants’ motivations for choosing publishing venues, as well as attitudes towards the more “open” options and, specifically, ORE. The final question, 35, offered the opportunity to add any additional comments in a free-text format.
The total number of valid responses was 1187 (excluding two responses that were discarded as duplicates), which is comparable to other global and multinational surveys involving researchers.31,34,35 No exclusion was applied to career stage, age, gender, or field of study. However, we only selected answers from respondents in the countries belonging to the EU and/or Council of Europe for further analysis, according to the study aim. Survey organizers discussed the preliminary results in June 2020 during the online Eurodoc Conference.36
The data from the survey was analysed in consideration of five different grouping variables: European region, gross domestic product (GDP), gross domestic expenditure on research and development (GERD) as a percentage of GDP, field of study, and career stage.
European region
The survey was specifically aimed at studying the features of OS and publishing practices, and these are in turn directly influenced by the current research culture (i.e., behaviours, values, expectations, attitudes, and norms of research communities). As we wanted to explore further the potential impact of cultural differences on OS and the adoption of related practices, a division of Europe reflecting a “model of cultural spaces excluding national and political intentions while applying (…) critical factors for today's social, political and economic situations”37 was used. According to the above division, the responding countries were grouped into the following six regions:
• Central Europe (Austria, Croatia, Czech Republic, Estonia, Germany, Hungary, Latvia, Lithuania, Poland, Slovak Republic, Slovenia, and Switzerland)
• Eastern Europe (Azerbaijan, Georgia, Russian Federation, and Ukraine)
• Northern Europe (Denmark, Finland, Iceland, Norway, and Sweden)
• South-eastern Europe (Bosnia and Herzegovina, Bulgaria, Cyprus, Greece, Romania, Serbia, and Turkey)
• Southern Europe (Italy, Malta, Portugal, and Spain)
• Western Europe (Belgium, France, Ireland, Netherlands, and the United Kingdom)
GDP
The second variable these countries were grouped by was GDP per capita, an important indicator of national economic progress.38 As financial support for scientific research can depend on a country's economic status, wealthy states are typically in a position to spend more.39 This may also affect research culture in various countries and regions. Although the relations between econometric and scientometric indicators might be indirect and need further investigation,39 a link between a country's scholarly publication output (an indicator of its research activity) and GDP has been reported both in a particular field of study41 and across various research fields.40 For this variable, data from the World Bank42 was used (the GDP ranges were selected in an effort to create a balanced share of responses among the groups below). The currency used is international dollars, a hypothetical unit of currency with the same purchasing power that the U.S. dollar had in the United States at a given point in time60 (in the case of these figures, October 2020):
• 2,500-10,000: Azerbaijan, Bosnia and Herzegovina, Bulgaria, Georgia, Serbia, Turkey, and Ukraine
• 10,000-25,000: Croatia, Czech Republic, Estonia, Greece, Hungary, Latvia, Lithuania, Poland, Portugal, Romania, Russian Federation, and Slovak Republic
• 25,000-40,000: Cyprus, Italy, Malta, Slovenia, and Spain
• 40,000-55,000: Austria, Belgium, Finland, France, Germany, Netherlands, Sweden, and United Kingdom
• 55,000 or more: Denmark, Iceland, Ireland, Norway, and Switzerland
GERD as a % of GDP
Since GDP may not be a simple and clear indicator to interpret,43 GERD as a percentage of GDP44 was also used for the analyses, and is mentioned by several studies as one of the most important contributing factors towards impactful scientific knowledge generation.41,45 According to GERD as a % of GDP, countries were divided into five groups (the ranges were again selected with the aim of achieving a balanced share of responses among the five groups):
• 0-0.75 %: Azerbaijan, Bosnia and Herzegovina, Cyprus, Georgia, Latvia, Malta, Romania, and Ukraine
• 0.75-1.25 %: Bulgaria, Croatia, Greece, Ireland, Lithuania, Poland, Russian Federation, Serbia, Slovak Republic, Spain, and Turkey
• 1.25-1.75 %: Estonia, Hungary, Italy, Portugal, and United Kingdom
• 1.75-2.25 %: Czech Republic, France, Iceland, Netherlands, Norway, and Slovenia
• 2.25 % or more: Austria, Belgium, Denmark, Finland, Germany, Sweden, and Switzerland
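The grouping rule above amounts to simple interval binning with inclusive lower bounds and exclusive upper bounds. The short sketch below illustrates this assignment; the function name and the example value are hypothetical, not taken from the study:

```python
# Hypothetical helper that assigns a country to one of the five
# GERD-as-%-of-GDP groups listed above. Lower bounds are inclusive,
# upper bounds exclusive; anything at or above 2.25 falls in the top group.
GERD_BOUNDS = [0.75, 1.25, 1.75, 2.25]
GERD_LABELS = ["0-0.75 %", "0.75-1.25 %", "1.25-1.75 %",
               "1.75-2.25 %", "2.25 % or more"]

def gerd_group(gerd_pct: float) -> str:
    """Return the label of the GERD range that gerd_pct falls into."""
    for bound, label in zip(GERD_BOUNDS, GERD_LABELS):
        if gerd_pct < bound:
            return label
    return GERD_LABELS[-1]

# Example: a country spending 1.9 % of GDP on R&D lands in the fourth group.
print(gerd_group(1.9))  # "1.75-2.25 %"
```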
Field of study
As the ORE platform is aimed at Horizon 2020 programme beneficiaries, the eight scientific areas defined in the H2020 programme guide for applicants46 were used for determining respondents' field of study (Chemistry, Economic Sciences, Information Science and Engineering, Environmental and Geosciences, Life Sciences, Mathematics, Physics, and Social Sciences and Humanities), together with an Interdisciplinary option.
Career stage (self-determined by respondents)
Data was aggregated into the following categories:
• Junior positions: student, doctoral candidate in higher educational institution, industrial doctoral candidate, junior researcher, and junior teaching staff member
• Middle positions: researcher, independent researcher, middle teaching staff member, industry professional, and entrepreneur
• Senior positions: senior researcher, and senior teaching staff member
Finally, the survey contained questions to assess participants' perception of their career stage and various aspects of their professional experience, in particular peer-reviewed articles and peer-review activity. This was important in order to further characterize the sample by variables of interest such as OPR. We can therefore present additional analyses concerning perception of career stage (whether respondents perceived themselves as ECRs) and scientific experience (whether they had participated in a H2020 project, the number of peer-reviewed research articles they had published, and the number of research articles for which they had performed peer review).
Categorical questions were analysed by evaluating the differences in the distribution of responses from each group with Pearson's Chi-squared test. The analysis was performed in R v.3.6.0 using the method ‘chisq.test’ from the stats package. Stacked bar plots and radar plots associated with this type of question were made in Microsoft Excel. Questions with numerical rating scales were analysed and plotted in GraphPad Prism® v. 9.1.0 and tested for differences between categories with the Kruskal–Wallis one-way analysis of variance followed by Dunn’s test for multiple comparisons.
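The same two tests can be reproduced outside R and Prism. The minimal Python sketch below uses scipy.stats equivalents of R's chisq.test and the Kruskal-Wallis test; all counts and scores are illustrative, not the survey data:

```python
# Minimal sketch of the analysis pipeline described above, using scipy.stats
# in place of R's chisq.test and GraphPad Prism. Data are illustrative only.
from scipy.stats import chi2_contingency, kruskal

# Hypothetical 2x3 contingency table: rows = OS awareness ("Yes"/"No"),
# columns = three respondent groups (e.g., regions).
table = [[120, 80, 60],
         [30, 40, 50]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"Chi-squared({dof}) = {chi2:.2f}, p = {p:.4f}")

# Hypothetical self-assessed knowledge scores (1-10 scale) for three groups,
# compared with the Kruskal-Wallis one-way analysis of variance.
group_a = [7, 8, 6, 9, 7, 8]
group_b = [5, 6, 4, 5, 6, 5]
group_c = [3, 4, 2, 5, 3, 4]
h_stat, p_kw = kruskal(group_a, group_b, group_c)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_kw:.4f}")
```

A significant chi-squared p-value here indicates that the distribution of answers differs between groups, which is exactly how the grouped bar plots in the figures below are annotated.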
Region
As the survey was Europe-oriented and dissemination channels focused on European researchers, the vast majority of respondents were located in Europe.33 We therefore considered only responses from European countries (1162; 97.9%) for further analysis. The total number of European countries covered by the survey was 37 (Figure 1). The largest number of responses came from Ukraine (178; 15.3%), Spain (174; 15%), Germany (101; 8.7%), Italy (88; 7.6%), Denmark (80; 6.9%), Slovenia (66; 5.7%), and Poland (59; 5.1%).
Although there was heterogeneity in the number of responses across countries, the representation by region was more balanced: central Europe (339; 29.2%), southern Europe (301; 25.9%), eastern Europe (202; 17.4%), western Europe (157; 13.5%), northern Europe (142; 12.2%), and south-eastern Europe (21; 1.8%). Answers were thus obtained from all European regions, but the number of respondents from south-eastern Europe was substantially lower than from the other regions. Consequently, we abstained from performing further analysis including this region.
GDP
The representation of country groups according to GDP was as follows: 2,500-10,000: 214 responses (18.4%); 10,000-25,000: 189 (16.3%); 25,000-40,000: 330 (28.4%); 40,000-55,000: 303 (26.1%); 55,000 or more: 126 (10.8%).
GERD as % of GDP
The representation of country groups according to GERD as % of GDP was as follows: 0-0.75%: 232 (20%); 0.75-1.25%: 278 (23.9%); 1.25-1.75%: 179 (15.4%); 1.75-2.25%: 177 (15.2%); 2.25% or more: 296 (25.5%).
Age
Age was one of the mandatory questions: 5 respondents were under 18 years old (0.4%), 58 between 18 and 24 (5%), 403 between 25 and 29 (34.7%), 320 between 30 and 34 (27.5%), 293 between 35 and 44 (25.2%), 56 between 45 and 54 (4.8%), 18 between 55 and 64 (1.5%), and 9 over 65 years of age (0.8%). The majority of respondents were between 25 and 44 years old (1016; 87.4%) and the average age was approximately 33. The respondents under the age of 18 considered themselves ECRs. Hence, we included their responses in the analysis, classifying them as First Stage Researchers (R1) according to the EURAXESS classification, meaning they are considered capable of at least “carry[ing] research under supervision”.61
Field of study
The division of respondents according to their field of study was as follows: Social Sciences and Humanities (304; 26.2%), Life Sciences (303; 26.1%), Information Science and Engineering (155, 13.3%), Physics (94, 8.1%), Interdisciplinary (88, 7.6%), Chemistry (79, 6.8%), Environmental and Geosciences (78, 6.7%), Economic Sciences (49, 4.2%) and Mathematics (12, 1%). As the field of Mathematics had substantially fewer answers than all other categories, we abstained from using this category in further analysis.
The representation of various categories of researchers regarding their career stage was as follows:
• Junior positions (799, 68.8%): student (44, 3.8%), doctoral candidate in higher educational institution (535, 46%), industrial doctoral candidate (15, 1.3%), junior researcher (177, 15.2%), junior teaching staff member (28, 2.4%)
• Middle positions (238, 20.5%): researcher (168, 14.5%), independent researcher (29, 2.5%), middle teaching staff member (16, 1.4%), industry professional (20, 1.7%), entrepreneur (5, 0.4%)
• Senior positions (87, 7.5%): senior researcher (64, 5.5%), senior teaching staff member (23, 2%)
• Other (38, 3.3%)
Finally, to further characterize the sample, we also assessed the respondents’ perception of their career stage and their scientific experience:
• The majority of respondents (904, 77.8%) considered themselves ECRs, while 122 (10.5%) were unsure
• 250 respondents (21.5%) participated in a Horizon 2020 research project
• The distribution of respondents regarding the number of peer-reviewed research papers they had published was as follows: no papers (156; 13.4%), 1-5 (514; 44.2%), 6-10 papers (193; 16.6%), 11-20 papers (151; 13%), 21-50 papers (93; 8%), > 50 papers (55; 4.7%)
• More than half of the respondents (606, 52.2%) had performed a peer review of research publications, either in their own name or together with more senior academics. Among those who had, the distribution of the number of peer reviews they had performed was as follows: 1-5 reviews (327; 28.1%), 6-10 reviews (93; 8%), 11-20 reviews (66; 5.7%), 21-50 reviews (67; 5.8%), >50 reviews (45; 3.9%).
Our analysis of the responses based on scientific experience demonstrates that most researchers typically begin peer reviewing after they themselves have published five or more research papers (Figure 2).
In this section of the survey, we investigated how familiar researchers were with OS and their attitudes towards it and its specific components.
This question explored the general awareness regarding OS, in order to determine the most and the least aware groups across different grouping variables. It was the “key” question, unlocking the other questions from the “OS section” for those who answered positively. The majority of respondents (76%) had heard about OS. However, we found differences across groups, mainly relating to the region of Europe, GDP, and GERD as % of GDP (Figure 3).
Respondents were asked if they had heard of OS (answered by indicating either “Yes” or “No”). Cumulative bar plots display the percentages of reported awareness for (A) all respondents, or (B) based on European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Statistical differences were evaluated with a Chi-squared test - results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
This question was available only for those who indicated that they had heard about OS (76% of the survey participants). It explored the level of OS knowledge among respondents, who were offered a 10-point scale for self-assessment, with the lowest mark (1) meaning “have just heard about it” and the highest (10) meaning “advanced expert.” The aim was to determine the most and the least knowledgeable groups across different grouping variables.
We found that the highest level of self-assessed knowledge of OS was reported by researchers in western European countries (Figure 4A), by a considerable margin. Equally, the highest knowledge level based on GDP (Figure 4B) was found in the 40,000 – 55,000 range, with much of this group comprising nations in western Europe. This suggests that researchers from this region are the most knowledgeable about, and positive towards, OS; the same trend is visible in the other analyses below. However, we found no significant differences between GERD as % of GDP categories. Among research disciplines (Figure 4D), the researchers in Information Science and Engineering and Interdisciplinary science reported the greatest knowledge of OS. We also found that knowledge of OS appears to increase with career seniority (Figure 4E).
Abbreviations: Sc.: Science, Eng.: Engineering.
The graphs illustrate the mean and standard deviation of the responses from each group, with the y-axes representing the numerical rating scale and the x-axes displaying the different categories assessed. General statistical significance was measured by the Kruskal–Wallis test and is indicated above each panel. Stars indicate significance for multiple comparisons with Dunn’s test, with corrected p-values. * P<0.05, ** P<0.01, *** P<0.001, **** P<0.0001.
The next question sought feedback regarding participants’ general attitude towards OS, to determine the respective differences across the grouping variables, and was, again, available only for respondents indicating that they had heard about OS (76%). Of those, 90% (84% of the whole sample) agreed that OS “is generally a good thing”. We found no significant differences between any of the variables’ groups; at least 80% of respondents in every European region, GDP and GERD as % of GDP range, field of study, and career stage held this view. The biggest difference, although not significant, was found for field of study: representatives of Interdisciplinary and Life Sciences showed the most positive attitude, while representatives of Physics and Chemistry the least.
The percentages were calculated based on the classification of respondents. Cumulative bar plots show the results for (A) all respondents, (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Statistical differences were evaluated with a Chi-squared test - results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
We next explored which features and activities of OS were most important to respondents aware of OS, to determine which specific elements across different grouping variables are more appealing to researchers. We noticed relevant differences in the responses across the groups, the most significant in the region, GDP, and GERD as % of GDP; while attitudes towards the importance of reproducible research, and open and FAIR data varied the most (Figure 6, Table 1).
In general, respondents consider Open Access the most important feature of OS, followed by reproducible research and open and FAIR data (Figure 6A). OPR is considered much less important. The remaining features (Open Licenses for research outputs, Open Metrics and Impact, and Citizen Science) are not considered important by most respondents. Notably, reproducible research and open and FAIR data are considered much less important by respondents in eastern Europe (Figure 6B), in the countries within the 2,500-10,000 GDP range (Figure 6C), and in those below 0.75% regarding GERD as % of GDP (Figure 6D). This was also the case for respondents in economic sciences (Figure 6E).
Respondents could choose more than one feature. Radar charts display the results in terms of the percentage of total respondents of each category. Statistical differences were evaluated with a Chi-squared test and results are shown in Table 1.
We next explored what advantages respondents who were aware of OS felt were offered by practicing OS, in order to determine the main incentives. Looking across the entire cohort studied, researchers in Europe indicated that the most important advantage of OS was “greater availability and accessibility of research outputs” (51.8% of respondents). At the same time, we found differences across groups, the most significant in European region, GDP, and GERD as % of GDP (Figure 7).
“Greater availability and accessibility of research outputs” was less crucial for respondents in western Europe (Figure 7B) and the countries within the 40,000-55,000 GDP range (Figure 7C). “Greater reproducibility and transparency of research outputs” was much less important on average, but crucial for respondents in western Europe (Figure 7B), in the countries within the 40,000-55,000 GDP range (Figure 7C), and in those spending the most on research (Figure 7D). Other options were considered much less important. Notably, the perceived importance of the “possibility of more transparent and rigorous peer-review process” tended to decrease with increasing GDP (Figure 7C) and GERD as % of GDP (Figure 7D) and to grow with career seniority (Figure 7F).
Respondents were asked to select only one main advantage from a standard list of answers or select the option “Other”. This question was not compulsory. Percentages were calculated based on the total number of respondents per category. Cumulative bar plots show results for (A) all respondents, or classified based on (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
This question explored the main concerns respondents (only those aware of OS) had around OS, in order to identify the hurdles behind OS uptake. Researchers in Europe indicated that the “emergence of low-quality and false science” was their main OS-related concern (32.9% of respondents). Importantly, we found differences across groups; the most significant in the European region, GDP, and field of study (Figure 8).
“Emergence of low-quality and false science” was particularly concerning for respondents in northern and southern Europe (Figure 8B), in the countries within the 55,000 or more and 25,000-40,000 GDP ranges (Figure 8C), and for respondents in chemistry, physics, and interdisciplinary research (Figure 8E). It was less worrying for respondents in western Europe (Figure 8B), the countries within the 40,000-55,000 GDP range (Figure 8C), and for respondents in social sciences and humanities and life sciences (Figure 8E). Notably, the prevalence of this concern decreased with career seniority (Figure 8F). “Missing sufficient training, tools and infrastructures” was the primary concern for respondents in western Europe (Figure 8B) and the countries within the 40,000-55,000 range (Figure 8C), and was more prevalent among respondents in information science and engineering, social sciences and humanities, and life sciences (Figure 8E). “Potential misuse of scientific research outputs” and “the public may misunderstand research outputs” were more concerning for respondents in eastern Europe (Figure 8B), in less wealthy countries in terms of GDP (the ranges below 25,000 - Figure 8C), and in those spending the least on research (Figure 8D). This was also relevant for respondents in economic sciences (Figure 8E). “More amount of work required from researchers” was less concerning for respondents in northern and southern Europe (Figure 8B), the wealthiest countries in terms of GDP (Figure 8C), and those in the 1.25-1.75% range (Figure 8D). This was also the case for respondents in chemistry and economic sciences (Figure 8E). Importantly, the prevalence of this concern increased with career seniority (Figure 8F).
Respondents were asked to select only one main concern from a standard list of options or to select the option “Other”. This question was not compulsory. Percentages were calculated based on the total number of respondents per category. Cumulative bar plots display results for (A) all respondents, or classified based on (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
In this section of the survey, we investigated how familiar researchers were with OPR and their attitudes towards it.
This question explored the general awareness regarding OPR, to determine the most and the least aware groups across the grouping variables. It was the “key” question unlocking the other questions from the “OPR section” for those who answered “Yes.” Most respondents had not heard of OPR (only 46% answered positively). We found differences across groups; the most significant in GDP and European region (Figure 9).
Unlike the rest, most responders in western Europe (Figure 9B) and countries within the 40,000-55,000 GDP range (Figure 9C) were aware of OPR. Respondents in southern and eastern Europe (Figure 9B) and countries within the 25,000-40,000 GDP range (Figure 9C) and in those spending below 1.25% of GDP on research (Figure 9D) were the least aware, which was also the case for the respondents in physics and economic sciences (Figure 9E). Awareness of OPR tended to grow with career seniority (Figure 9F).
Respondents were asked if they had heard about OPR. Cumulative bar plots display percentages of awareness for (A) all respondents or based on (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
This question was available only for those who indicated that they had heard about OPR. It further explored respondents' experience passing through the OPR process to determine the level of engagement in this open practice from the author’s side. Only 22.6% of those who responded to this question had had this experience, and we found some differences among groups, the most significant in career stage and European region (Figure 10).
Experience of passing through the OPR process significantly increased with career seniority (Figure 10F). The most experienced responders were in eastern Europe (Figure 10B) and in countries within the 10,000-25,000 GDP range (Figure 10C), while the least experienced were in northern and central Europe (Figure 10B). This was also the case for researchers in physics and information science and engineering (Figure 10E).
Respondents were asked if any of their papers had been through OPR. Percentages were calculated based on the total number of respondents per category. Cumulative bar plots show the results for (A) all respondents, or classified based on (B) region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. This question was not compulsory. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
This question was again available only for those who indicated that they had heard about OPR. It further studied respondents' experience in OPR from the reviewer’s side. Even fewer respondents (17.6%) had had the experience of reviewing in OPR, and we found some differences among groups, the most significant in career stage and region again (Figure 11).
As in the previous question, the experience of reviewing research outputs openly significantly increased with career seniority (Figure 11F). However, unlike the previous question, the most experienced responders were in western Europe (Figure 11B) and in countries within the 40,000-55,000 GDP range (Figure 11C). The least experienced responders were in northern Europe (Figure 11B). Regarding the field of study, respondents in chemistry, physics, and social sciences and humanities had had the least experience (Figure 11E).
Respondents were asked if they had been a reviewer in open peer review. Percentages were calculated based on the total number of respondents per category. Cumulative bar plots show the results for (A) all respondents, or classified based on (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. This question was not compulsory. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
The next question sought feedback regarding participants’ attitudes towards OPR compared to conventional (closed) peer review, to determine the differences across various dimensions. This question was available only for respondents who indicated that they had heard about OPR. We found no significant differences between groups in any of the studied variables. Overall, 41.8% of those who responded to this question thought that OPR is better, while 45.3% answered “Not sure,” consistent with the generally low awareness of OPR.
We next explored which advantages respondents who were aware of OPR felt it offered, to determine the main incentives for practicing it. Looking across the entire cohort studied, researchers in Europe indicated that the most important advantages of OPR were that it “encourages reviewers to be more tactful and constructive” and “improves communication and understanding between authors, reviewers, editors and the broader community in general” (both options gained 35.0% of votes of those who answered this question). We found differences across groups, the most significant in field of study, European region, and GDP (Figure 12).
“Encourages reviewers to be more tactful and constructive” was selected less by respondents in eastern Europe (Figure 12B), the least wealthy countries in terms of GDP (Figure 12C), and those spending less than 1.25% of GDP on research (Figure 12D). This was also the case for responders in chemistry, interdisciplinary fields, and physics, while for respondents in environmental and geosciences and life sciences, this advantage was particularly important (Figure 12E). “Improves communication and understanding between authors, reviewers, editors and the broader community in general” was more important for researchers in northern Europe (Figure 12B) and those working in physics and interdisciplinary fields (Figure 12E). This advantage was less critical for responders in eastern Europe (Figure 12B), the least wealthy countries in terms of GDP (Figure 12C), and in those spending the least on research (Figure 12D). This was also the case for respondents in environmental and geosciences and economic sciences (Figure 12E). “Leads to more objective reviews” was much less critical for respondents in northern and southern Europe (Figure 12B) and those working in physics (Figure 12E). This advantage was more important for respondents in chemistry and economic sciences (Figure 12E). “Helps to detect reviewers' conflicts of interests” was most important for respondents in eastern and southern Europe (Figure 12B), the least wealthy countries in terms of GDP (Figure 12C), and in those spending less than 1.25% of GDP on research (Figure 12D). This was also the case for respondents in chemistry and social sciences and humanities (Figure 12E), as well as for senior researchers (Figure 12F). This advantage was less critical for respondents in central and western Europe (Figure 12B), in countries within the 40,000-55,000 GDP range (Figure 12C), and for researchers in interdisciplinary fields and economic sciences (Figure 12E).
Respondents were asked to select only one main advantage from a standard list of answers or select the option “Other”. This question was not compulsory. Percentages were calculated based on the total number of respondents per category. Cumulative bar plots display the results for (A) all respondents, or classified based on (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
The next question explored the main concerns respondents (those aware of OPR) had about practicing OPR, to identify obstacles to OPR implementation. The main OPR-related concern of researchers in Europe (39% of responders) was that OPR “may disadvantage early-career researchers and be an advantage for established ‘big name’ researchers”, followed by “open reports may be less critical” (24%), “increased likelihood of reviewers declining to review” (21.8%), and “more amount of work required from reviewers” (10.8%). We found no significant differences between groups.
This question explored the level of perceived reward for peer reviewing among respondents, who were offered a 10-point scale for self-assessment, where the lowest mark (1) meant “not rewarded at all” and the highest (10) “extremely rewarded.” The aim was to determine the most and the least rewarded groups (self-assessed) across different grouping variables. On average, the self-assessed level of feeling rewarded was 4.4.
The results are displayed as follows: (A) European region, (B) GDP, (C) GERD as % of GDP, (D) field of study, and (E) career stage. The graphs illustrate the mean and standard deviation of the responses from each group, where the y-axes represent the numerical rating scale and x-axes display the different categories assessed. General statistical significance was tested using the Kruskal-Wallis test and is indicated above each panel. Stars indicate significance for multiple comparisons using Dunn’s test, with corrected p-values. * P<0.05, ** P<0.01, *** P<0.001, **** P<0.0001. The statistical test did not include the overall results as its main purpose was to detect differences between the different subcategories.
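The two-step procedure described in this caption, a global Kruskal-Wallis test followed by Dunn's pairwise comparisons with corrected p-values, can be sketched as follows. The ratings are hypothetical (not survey responses), and this Dunn implementation uses a normal approximation with a Bonferroni correction, omitting the tie correction for brevity.

```python
# Global Kruskal-Wallis test, then pairwise Dunn comparisons on mean ranks.
import numpy as np
from itertools import combinations
from scipy.stats import kruskal, rankdata, norm

groups = {                       # hypothetical 10-point "feeling rewarded" ratings
    "junior": [3, 4, 2, 5, 4, 3, 4],
    "mid":    [5, 4, 6, 5, 4, 5, 6],
    "senior": [6, 7, 5, 6, 7, 6, 5],
}

h_stat, p_global = kruskal(*groups.values())   # global test across all groups

# Dunn's test: z-statistic on the difference of mean ranks for each pair.
pooled = np.concatenate(list(groups.values()))
ranks = rankdata(pooled)
n_total = len(pooled)
mean_rank, sizes, start = {}, {}, 0
for name, vals in groups.items():
    sizes[name] = len(vals)
    mean_rank[name] = ranks[start:start + len(vals)].mean()
    start += len(vals)

n_pairs = len(groups) * (len(groups) - 1) // 2
dunn_p = {}
for a, b in combinations(groups, 2):
    se = np.sqrt(n_total * (n_total + 1) / 12 * (1 / sizes[a] + 1 / sizes[b]))
    z = abs(mean_rank[a] - mean_rank[b]) / se
    dunn_p[(a, b)] = min(1.0, 2 * norm.sf(z) * n_pairs)   # Bonferroni-corrected
```

In practice a library routine such as `posthoc_dunn` from the `scikit-posthocs` package would typically be used instead of the hand-rolled pairwise loop; the sketch only makes the underlying rank arithmetic explicit.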
The number of answers to this question was substantially lower than for the other questions (7.7% of responses), matching the generally low level of experience with OPR among European researchers. Hence, we abstained from analysing these responses.
In this section of the survey, we investigated how familiar researchers were with Open publishing venues and their attitudes towards them.
We studied which factors respondents felt were decisive when choosing where to publish research outputs to identify desirable features of publishing venues. Most researchers in Europe still consider the journal impact factor the most important variable, followed by a high-quality peer review process. Moreover, we found significant differences across groups, especially in the European region and GDP (Figure 7, Table 2).
The results are displayed as follows: (A) results including all respondents, (B) region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Respondents could choose more than one channel of preference. Radar plots display the results in terms of the percentage of total respondents of each category. Statistical differences were evaluated with a Chi-squared test and results are shown in Table 2.
This question explored general awareness of open publishing venues to identify the most and the least aware groups across different grouping variables. It was the “key” question unlocking the other questions in the “open publishing venues section” for those who answered positively. The majority of respondents were not aware (only 39.6% answered positively). However, differences across groups were found, the most significant in GDP, European region, and GERD as % of GDP (Figure 15).
Abbreviations: Sc.: Science, Eng.: Engineering.
Respondents in western Europe (Figure 15B), in countries within the 40,000-55,000 GDP range (Figure 15C), and within the 1.25-1.75% GERD range (Figure 15D) were the most aware of existing open publishing venues. Researchers in eastern Europe (Figure 15B), in the least wealthy countries in terms of GDP (Figure 15C), and in those spending below 1.25% of GDP on research (Figure 15D) were the least aware. Junior researchers were the least knowledgeable compared to their more senior colleagues (Figure 15F).
Percentages were calculated based on the total number of respondents per category. Cumulative bar plots display the results for (A) all respondents, or classified based on (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. This question was not compulsory. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
This question was available only for those who reported being aware of open publishing venues. It further explored respondents' experience with OS practices to identify the level of engagement with existing open publishing venues. From the offered list of such venues, respondents were mostly experienced with PLOS One, but significant differences among groups, especially in GDP, European region, and field of study, were again found (Figure 16, Table 3).
Abbreviations: Sc.: Science, Eng.: Engineering.
| | Region | GDP | GERD | Field | Stage |
|---|---|---|---|---|---|
| | χ² (4, N = 1141) | χ² (4, N = 1162) | χ² (4, N = 1162) | χ² (7, N = 1150) | χ² (2, N = 1124) |
| eLife (elifesciences.org) | 38.6 [8.44e-08] | 58.95 [4.82e-12] | 24.97 [5.10e-05] | 66.96 [6.06e-12] | 3.96 [1.38e-01] |
| F1000 (f1000research.com) | 39.2 [6.35e-08] | 31.09 [2.94e-06] | 19.44 [6.44e-04] | 20.47 [4.64e-03] | 9.97 [6.82e-03] |
| Frontiers (frontiersin.org) | 42.15 [1.55e-08] | 41.34 [2.28e-08] | 33.46 [9.63e-07] | 49.69 [1.66e-08] | 10.1 [6.42e-03] |
| MDPI (mdpi.com) | 17.65 [1.44e-03] | 21.39 [2.65e-04] | 22.59 [1.53e-04] | 42.3 [4.55e-07] | 8.66 [1.32e-02] |
| PLOS One (plos.org) | 39.92 [4.50e-08] | 38.55 [8.63e-08] | 33.27 [1.05e-06] | 54.06 [2.29e-09] | 9.87 [7.21e-03] |
| ScienceOpen (scienceopen.com) | 1.07 [8.99e-01] | 3.64 [4.57e-01] | 1.97 [7.42e-01] | 10.99 [1.39e-01] | 2.46 [2.92e-01] |
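The bracketed p-values in the table follow directly from each chi-squared statistic and its degrees of freedom via the chi-squared survival function, which the snippet below reproduces for two of the eLife cells as a sanity check.

```python
# Reproduce the tabled p-values from the chi-squared statistic and its
# degrees of freedom using the right-tail (survival) function.
from scipy.stats import chi2

def chi2_pvalue(stat, dof):
    """Right-tail p-value for a chi-squared statistic with `dof` degrees of freedom."""
    return chi2.sf(stat, dof)

# eLife vs. Region: chi2(4, N = 1141) = 38.6 -> tabled [8.44e-08]
p_region = chi2_pvalue(38.6, 4)
# eLife vs. Stage: chi2(2, N = 1124) = 3.96 -> tabled [1.38e-01], not significant
p_stage = chi2_pvalue(3.96, 2)
```

The same check applies to any row of the table; only the statistic and the degrees of freedom in the column header change.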
Although some of the open publishing venues listed in the survey were more popular among respondents than others (Figure 16A), it is evident that some groups of survey participants are generally much more experienced than others. Respondents in western Europe (Figure 16B) and countries within the 40,000-55,000 GDP range (Figure 16C) are the most experienced. In contrast, respondents in eastern Europe (Figure 16B), the least wealthy countries in terms of GDP (Figure 16C), and those spending the least on research (Figure 16D) have much less experience. Regarding the field of study, experience with specific publishing venues might reflect their thematic focus. Nevertheless, respondents in life sciences, interdisciplinary fields, and information science and engineering have more experience, while respondents in physics and economic sciences have less (Figure 16E).
Respondents were given a set of open journals from which they could select multiple venues. They were given the option to add “Other” venues, but only individual venues that accounted for more than 5% of the responders are shown. Radar charts display responses by (A) all respondents, or classified by (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. In total, 393 (34%) respondents answered this question. The results are shown in terms of percentage of total respondents of each category. Statistical differences were evaluated with a Chi-squared test and results are shown in Table 3.
This question was again available only for those who reported being aware of open publishing venues. We explored which features of open publishing venues were reported as being the most important to identify the main motivations underlying their use. “Ability to address a wider audience” and “potential scientific impact and citations” were reported as the most important motivations, and we found important differences between categories, especially in the GDP, GERD as % of GDP, and European region (Figure 17, Table 4).
Abbreviations: Sc.: Science, Eng.: Engineering.
Respondents were given a set of motivations and could select multiple options. Results refer to analyses including (A) all respondents or classified by (B) European region, (C) GDP, (D) GERD as % of GDP, (E) fields of study, and (F) career stage. In total, 429 (37%) respondents answered this question. Radar charts display the results in terms of the percentage of total responders of each category. Statistical differences were evaluated with a Chi-squared test and results are shown in Table 4.
As the survey was initially developed to inform the ORE team, we explored the awareness level of this project across Europe. At the time of the survey, only 19.9% of respondents had heard about the ORE platform. Despite the generally low awareness level, we found only minor differences across categories, namely in European region and career stage (Figure 18).
Respondents were asked if they had heard about the EC’s plans to establish ORE. Cumulative bar plots display the percentage of reported awareness for (A) all respondents, (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. This question was not compulsory. Statistical differences were evaluated with a Chi-squared test; these results are shown below each graph. Degrees of freedom and sample size per categorization are shown between parentheses and p-values are shown between brackets.
Finally, the way research findings are disseminated is changing with digitalization and today's communication technology. Indeed, it has been demonstrated that higher engagement of the public with research happens through non-traditional routes such as social media.32 However, how research should be disseminated outside the standard route (academic journals or conference presentations) is debated. In particular, the use of social media should not substitute for the validation provided by peer review and adequate editorial checks. Envisioning OS as “research knowledge shared (…) through collaborative networks”3 and with the aim of analysing publishing practices, a section called “Scientific Communication” was included in the survey, questioning the new routes used by researchers to communicate their results.
This question explored respondents' experience in science communication through social media to identify the level of engagement with the most popular online social media platforms. From the offered list of platforms, responders reported being mostly experienced with ResearchGate. Significant differences were found among groups, particularly in GDP, European region, and field of study (Figure 19, Table 5).
Abbreviations: Sc.: Science, Eng.: Engineering.
Unlike the others, respondents in eastern Europe (Figure 19B), in the least wealthy countries in terms of GDP (Figure 19C), and in those spending the least on research (Figure 19D) rarely use Twitter to disseminate their research outputs but use Facebook and Academia more often. Respondents in western and northern Europe (Figure 19B), in the wealthier countries in terms of GDP (Figure 19C), and in those spending more on research (Figure 19D) tend to choose Twitter more often. Using LinkedIn to disseminate research outputs is more prevalent in northern Europe (Figure 19B). Respondents in social sciences and humanities, and economic sciences use Academia and Facebook more, while Twitter was more prevalent among life sciences and interdisciplinary respondents (Figure 19E).
The results are displayed as follows: (A) results including all respondents, (B) European region, (C) GDP, (D) GERD as % of GDP, (E) field of study, and (F) career stage. Respondents could choose more than one channel of preference. Respondents were given the option to add “Other” social media used for dissemination; however, these responses are not shown as each accounted for less than 5% of responses. In total, 1017 (88%) respondents answered this question. Radar charts display the results in terms of percentage of total respondents of each category. Statistical differences were evaluated with a Chi-squared test and results are shown in Table 5.
This question explored the level of satisfaction with research output dissemination through social media channels, in order to determine the most and the least successful groups in this regard. Respondents were offered a 10-point scale for self-assessment, where the lowest mark (1) meant “not satisfied at all” and the highest (10) “extremely satisfied.” In general, respondents were moderately satisfied with disseminating their research through social media channels, with a reported average level of 5.3. We found no significant differences between groups.
Research activity is growing year-on-year, alongside a meteoric rise in publications and the number of journals.47,48 Despite widespread discussion of the importance of OS practices for transparency, reproducibility, and evaluation of research outputs, as well as increased access to knowledge, the deployment of OS tools and practices remains limited.16 In the present study we sought to explore if general awareness of and attitudes towards OS would provide a potential explanation for its limited use. We therefore studied if and how ECRs currently use OS, their general knowledge and attitudes of OS, and then focused on OPR and open publishing venues.
In summary, the results of the survey suggest that the awareness level in Europe regarding OS in general, and specifically among ECRs, is high, and views about it are generally positive: more than three-quarters of responders have heard about OS, and 90% of those agree that “it is generally a good thing” (Figures 3 and 5). Awareness of specific aspects of OS, such as OPR and open publishing venues, is not so high though, with percentages under 50% for OPR and 40% for open venues. The majority of respondents have not experienced OPR either as authors or as reviewers, although career seniority was found to affect this. Responders report that they still look at the impact factor when selecting a publishing venue. Importantly, significant differences were found in the awareness of, and attitudes towards, various aspects of OS and science communication between researchers representing different European countries/regions, disciplines, career stages, and levels of scientific experience.
With regard to European regions, we identified three main groups sharing roughly similar awareness levels and attitudes towards OS, as well as common science communication patterns: researchers in western Europe, the most informed and positive group; researchers in northern, central, and southern Europe, a moderately aware and positive group with some minor internal differences; and researchers in eastern Europe, the least informed group with the least positive attitudes. Overall, opinions of researchers in eastern Europe deviated the most from the rest of the respondents. One explanation is that these countries are not directly affected by EU policies; they are also the lowest-income area included in the study (GDP) and have the lowest level of investment in research (GERD as % of GDP). Another factor may be post-Soviet inertia (preserved negative and obsolete aspects of the previous period). This can manifest in a low level of replacement of aging human resources and outdated institutional mechanisms and equipment, which pose serious problems for reforms in the research field such as the implementation of OS principles.49 This also correlates with a lower perception of academic integrity values in this region.50
Although we found no direct correlation between OS and OPR awareness levels and a country's wealth (GDP) or GERD as % of GDP, these variables do correlate slightly with a positive attitude towards OS. Indeed, the least wealthy countries in terms of GDP and those spending the least on research demonstrate the lowest general level of OS awareness and positive attitude. However, the results show that the opposite is not the case: researchers in western Europe, though not in the wealthiest countries in terms of GDP, are the ‘OS champions’ of the continent, with the highest level of awareness and knowledge in this field, representing the “frontier of knowledge” regarding various OS features, and acknowledging the advantages of open publishing venues. Also, their primary OS-related concern is “missing sufficient training, tools and infrastructures” rather than “emergence of low-quality and false science”, the main concern across all the other regions. It is nevertheless encouraging that the widespread concern of the other regions is a practical one, possibly based on experience, and that it focuses on the implementation of OS rather than questioning the very idea of it (Figure 8).
OA is considered the most important feature of OS by all categories of responders, followed by reproducible research and open and FAIR data. OPR is considered much less important. Reproducible research is considered significantly less important by researchers in eastern Europe and lower-income countries (even compared to the relatively low rating they gave to other OS features). This could be evidence of a weak research data management culture and may worsen the “reproducibility crisis” in these countries.51 Citizen science, open metrics and impact, as well as open licenses for research outputs are not considered important by any of the categories of respondents. Notably, regardless of their category, the responders considered greater availability and accessibility of research outputs the most important advantage of OS (which echoes the above statement about OA). The emergence of low-quality and false science was their main OS-related concern. A possible explanation for this may be the recent boost in awareness of OA driven by funders’ requirements in Europe. Notably, Plan S, initiated by research funders (cOAlition S), set out ten principles requiring all scholarly publications to be made accessible in Open Access journals, on Open Access platforms, or immediately available through Open Access repositories without embargo, effective from 2021. Therefore, Plan S may have contributed to improving awareness of the relevance of OA, making it the best-known OS practice among researchers. Another explanation may be a lack of knowledge about the meaning of other OS practices, which may be more evident from the number of respondents on OPR awareness and attitudes described in the next paragraph.
Researchers in western Europe are, again, the most aware of and the most active in OPR (together with researchers in countries investing more in research and development). This contrasts with the other European regions, where more than 50% of respondents had not heard of OPR (the lowest awareness level, around 30%, was in southern Europe) and less than 20% had peer reviewed research outputs openly (the lowest involvement levels were in the northern and southern parts of Europe). Similarly to the above-mentioned OS awareness and attitude analysis, we found no direct correlation between the OPR awareness level and a country's wealth (GDP) or expenditure on research and development (GERD as % of GDP), except in those countries with the highest GDP and GERD as % of GDP. All regions shared approximately the same views regarding the advantages and disadvantages of OPR, as well as its general advantage over conventional closed peer review (almost 42% of those who responded to this question believed that OPR was better, while more than 45% were unsure, reflecting the lack of knowledge on OPR). The responders considered the encouragement of reviewers to be more tactful and constructive, as well as the improvement of communication and understanding between authors, reviewers, editors and the broader community in general, as the most important advantages of OPR. The main concern was that OPR may disadvantage ECRs and be an advantage for established “big name” researchers.
In general, European researchers do not feel particularly rewarded for their work as peer reviewers, and this might lead to additional challenges in the implementation of OPR. Hence, further research on establishing effective reward systems for reviewers (both monetary and non-monetary, such as informal recognition52 or turning peer review into a measurable research output, Publons-style53) is needed, with onward analysis of the reported negative effects of such rewards, for instance discouraging the most motivated and competent reviewers.54
Evidence for the three-group pattern described above persists with regard to awareness of and attitudes towards various publishing venues. When marking the most important factors in deciding where to publish research outputs, researchers in western Europe once again demonstrated their commitment to OS and research integrity with a strong focus on high-quality peer review and OA, as well as a noticeable demand for rapidity of publishing and the ability to publish all research outputs. The journal impact factor is still an important indicator for researchers in western Europe, but much less weighty than for their colleagues in northern, central and southern Europe. Researchers in these regions again have similar attitudes, with some differences. They are much less oriented towards OA (most markedly researchers in central Europe) and mainly focus on the impact factor (researchers in southern Europe the most, their colleagues in northern Europe the least). High-quality peer review was also heavily weighted by these groups (researchers in northern Europe the most, their colleagues in southern Europe the least), while rapidity of publishing and the ability to publish all research outputs were much less valued.
The attitudes of researchers in eastern Europe again deviate the most from the other regions, demonstrating a completely different set of priorities, such as indexing in major citation databases and publication fees, and paying significantly less attention to impact factors, high-quality peer review, and OA. In addition, researchers in eastern Europe are the least motivated by the features of open publishing venues and have far less experience in using them compared to their colleagues in other European regions.
As most of the respondents in eastern Europe were from Ukraine, a view of the current state of the research publication infrastructure in this country helps explain their priorities. As of September 2021, according to Scimago, there were only 61 Ukrainian research journals indexed in Scopus (one of the two most important abstract and citation databases today55). These journals do not cover all research disciplines present in Ukraine and have relatively low impact indicators. For reference, Poland, a directly neighbouring state with a comparable population (44.4 million in Ukraine and 38 million in Poland as of 2019), had 451 journals indexed in Scopus as of September 2021.
The above analysis suggests an “evolution of needs and focus” regarding contemporary scientific publishing with three successive levels: basic, competitive, and collaborative (Table 6). Each level reflects a particular stage of development of the science communication infrastructure and related research culture in a certain country or region, which defines a specific set of connected needs, goals, and success indicators.
By the “basic level,” we mean a relatively weak science communication infrastructure with a lack of quality publishing venues, which puts researchers' focus on participation, i.e., gaining the ability to use publishing venues, indexed in the most critical databases, for their research outputs dissemination. For example, the basic level can be currently observed in Ukraine. By the “competitive level,” we mean a relatively developed science communication infrastructure, characterized by a sufficient number of rival quality publishing venues, which puts researchers' focus on competition, i.e., being able to get accepted in the most prestigious ones and thus prove their “excellence.” Countries on the basic level might be able to reach this level by developing their research infrastructures. For example, the “competitive level” can be currently observed in Italy and Spain. By the “collaborative level,” we mean a highly developed science communication infrastructure and research culture with a completely different set of priorities such as collaboration and societal impact, based on the principles of OS. Countries on the “competitive level” might reach this level by changing their research culture towards openness and collaboration and reforming research assessment systems while innovating their research infrastructure. As of December 2021, no country can be confidently listed here as an example of this level, but representatives of western Europe, such as Belgium and the Netherlands, are arguably the closest to it.
According to the current study, most European regions (western, northern, central, and southern Europe) are in transition from the competitive to the collaborative level. In this regard, an essential question emerges for eastern Europe and for developing countries worldwide: what is the best way to reach the “collaborative level” directly from the basic one, bypassing competition based on bibliometric indicators?
Several social media platforms are used to disseminate research outputs. ResearchGate was the most popular amongst all respondents, followed by Twitter in western and northern European countries, LinkedIn in northern European countries, Facebook in eastern European countries, and Academia among researchers in the humanities and economics. Few researchers reported using other online media for this purpose. ResearchGate was also the most popular platform for reaching out to other researchers for research-related advice, suggesting it is a valuable forum for reaching scientists of any background. With social media growing more popular for both the dissemination of research and researchers’ own branding, even in low- and middle-income countries,56,57 these platforms represent a viable avenue for hosting and reinforcing OS practices.
The survey was disseminated mainly among Eurodoc volunteers, who typically have an interest in European and international developments, and among people interested in OS, meaning that respondents’ level of knowledge at the time of the survey might be somewhat higher than the average among all researchers. The survey also lacked responses from certain countries, particularly in south-eastern Europe; this is not unexpected, as Eurodoc has low representation in this area.
Given the focus on fields of study, it is important to acknowledge the uneven distribution of respondents from the various European regions across research areas, which significantly impacts the results. When analyzing the two largest groups by field of study (life sciences, and social sciences and humanities, each comprising 26.1% of all responses), we found a substantial gap in several regions’ representation. For instance, among respondents from eastern Europe, only 23 (7.4%) were in life sciences while 80 (25.8%) were in social sciences and humanities. In contrast, among respondents from southern Europe, 105 (33.9%) were in life sciences and 72 (23.2%) in social sciences and humanities, and among respondents from central Europe, 82 (26.5%) were in life sciences and 64 (20.6%) in social sciences and humanities. Regarding science communication through social media (Figure 19), the general pattern of ResearchGate’s popularity applies to all categories; however, the most noticeable deviations (the use of other platforms) can be explained by the high percentage of eastern European researchers among representatives of social sciences and humanities and of economic sciences (who use Academia and Facebook more). Considering the differences discussed above in awareness of and attitudes towards OS and online science communication across European regions, we recommend treating this section of the survey’s results with caution and organizing a new dedicated survey that aims at a more balanced geographical distribution or studies specific regions.
Regarding career stage, there is a positive correlation between respondents’ scientific experience and their awareness of, and positivity towards, OS and OPR. The latter correlation is stronger due to the limited engagement of ECRs in peer review of any type (Figure 2). This result could be explained by elements of a researcher’s career stage (particularly seniority, age group, participation in a Horizon 2020 project, and publishing/reviewing activities) reflecting the researcher’s experience, such as sophistication in science communication and a more extensive collaboration network. On the other hand, it might indicate insufficient OS promotion and training among ECRs and insufficient engagement of ECRs in open practices. Although the correlation is clear, the primary target group of the survey was ECRs (920, 77.6% of respondents) and researchers in junior positions (815, 68.7%); moreover, the largest group of respondents in senior positions was from eastern Europe (37, 41.6%). These limitations suggest that further research could examine differences in attitudes towards OS and its components among stakeholder groups with different levels of experience (ECRs, researchers in middle and senior positions, and research librarians, arguably the most knowledgeable group). Such research could help identify the most appropriate focus points for OS training and support and for researcher incentive/reward systems, as stated in the Eurodoc input on the UNESCO open science recommendation, where we asked for investments in human, educational, and infrastructural resources58 and which was also mentioned in the latest version of the recommendation.
Finally, as this was an exploratory study, the statistical tests were not carried out in a hypothesis-driven manner (e.g., no corrections for multiple comparisons were applied), and interpretation of the results should bear this in mind.
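For readers unfamiliar with the issue, the following is a purely illustrative sketch (not taken from the study, and using invented p-values) of what a correction for multiple comparisons does: a simple Bonferroni adjustment multiplies each p-value by the number of tests, so that only much stronger evidence survives when many comparisons are made at once.

```python
# Illustrative Bonferroni correction for multiple comparisons.
# The p-values below are invented for demonstration only.

def bonferroni(pvals, alpha=0.05):
    """Return (adjusted p-values, rejection decisions) under Bonferroni."""
    m = len(pvals)
    adjusted = [min(p * m, 1.0) for p in pvals]
    reject = [p_adj <= alpha for p_adj in adjusted]
    return adjusted, reject

# Five hypothetical tests: only the smallest p-value survives correction,
# even though three of the five were "significant" before adjusting.
pvals = [0.004, 0.03, 0.04, 0.20, 0.50]
adjusted, reject = bonferroni(pvals)
print(reject)  # [True, False, False, False, False]
```

In a hypothesis-driven analysis, such an adjustment (or a less conservative variant such as Holm or Benjamini-Hochberg) would typically be applied before interpreting any individual test as significant.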
In our study, we sought to gain new perspectives on OS knowledge and motivation for its adoption amongst researchers, using a range of dimensions relevant to demographics and background. Our results suggest that awareness of OS depends more on the European region (with western countries leading) than on countries’ wealth and research expenditure. Moreover, researchers with greater knowledge of OS showed a more positive attitude towards OS practices. We therefore postulate an “evolution of needs and focus” based on the level of OS knowledge and practice, currently categorized into basic, competitive, and collaborative levels (Table 6). Consequently, another question arises: what will the science communication landscape look like, across the academic community of a specific country/region or globally, after the “collaborative level” is reached, and what new challenges might arise? Perhaps this will lead to a renewed focus on competition, but with a completely different set of indicators oriented towards teams and societal engagement rather than individual researchers. Advanced technology might also play a more frequent role, with the introduction of machine-learning approaches to support and assess various research processes and outputs.
In light of the survey, which corroborates previous findings on OS practices and adds further evidence, we hypothesize that a lack of OS awareness and negative attitudes may have implications for the suggested changes to the current scientific culture, which is more focused on quantitative metrics and outputs than on the quality and integrity of research. It is evident that OS practices are not yet integrated into research assessment procedures,59 thus constituting a burden for researchers rather than a motivation, given the effort and time they require. However, there is still a lack of cross-country studies of specific OS practices, such as OPR. Our study has clear implications for the design of OS implementation programs: focusing on training and introducing changes to the reward system through a revision of research assessment culture and procedures. Future studies should also monitor researchers’ approach to OS and science communication, given the debate raised during the COVID-19 pandemic about disseminating scientific outputs through social media, in particular at early stages (i.e., through preprints), and about enrolling citizens and policy decision-makers in this process.
Zenodo: Eurodoc Survey on Publishing in Open Science 202033
https://doi.org/10.5281/zenodo.5460097
This project contains the following underlying data:
Zenodo: Eurodoc Survey on Publishing in Open Science 202033
https://doi.org/10.5281/zenodo.5460097
This project contains the following extended data:
Data are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).
We would like to thank the Eurodoc OS Ambassadors 2019 cohort and the ORE project team (especially the communications group members) for contributing to, commenting on, and shaping the questions in the survey. Many thanks to the Eurodoc members and administration of 2020 for helping us reach out to the respondents, whom we also thank for taking the time to answer the present survey. Finally, we would like to thank the F1000Research team for supporting the publication of the present study.
Is the work clearly and accurately presented and does it cite the current literature? Yes
Is the study design appropriate and is the work technically sound? Yes
Are sufficient details of methods and analysis provided to allow replication by others? Partly
If applicable, is the statistical analysis and its interpretation appropriate? Partly
Are all the source data underlying the results available to ensure full reproducibility? Yes
Are the conclusions drawn adequately supported by the results? Partly
Competing Interests: As an Early Career Researcher, I often participate in activities with Eurodoc and am a member of the Eurodoc Research Integrity and Research Assessment working group. Consequently, I know and often interact with some of the authors of the manuscript. I discussed these ties with the editorial team at F1000, and we concluded that this conflict of interest was acceptable as long as I believed it would not influence my assessment of the manuscript. I believe that I performed this peer review impartially and to the best of my knowledge, regardless of my prior acquaintance with the authors.
Reviewer Expertise: Research on research, research integrity, research assessments, publication ethics
Is the work clearly and accurately presented and does it cite the current literature? Yes
Is the study design appropriate and is the work technically sound? Yes
Are sufficient details of methods and analysis provided to allow replication by others? Yes
If applicable, is the statistical analysis and its interpretation appropriate? No
Are all the source data underlying the results available to ensure full reproducibility? Yes
Are the conclusions drawn adequately supported by the results? Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Open science specialist, climate scientist
Version 1: 22 Dec 2021