Research Article

Political bias in historiography - an experimental investigation of preferences for publication as a function of political orientation

[version 1; peer review: awaiting peer review]
PUBLISHED 24 Mar 2025

This article is included in the Research on Research, Policy & Culture gateway.

Abstract

Background

This study examines the influence of political preferences on historians’ assessments of the publishability of contemporary history abstracts by investigating whether historians favor abstracts that align with their political orientation.

Methods

In an online experiment, 75 historians evaluated 17 fictitious contemporary history abstracts from 17 pairs, each presented with either a progressive or conservative stance. The participants made initial intuitive assessments regarding the publishability of each abstract and later provided more considered responses, also rating their Feeling of Rightness (FOR) regarding their initial judgments.

Results

The results revealed a significant interaction effect between an abstract’s political stance and historians’ political orientation, consistent with the observation that right-wing historians prefer conservative abstracts, left-wing historians prefer progressive abstracts, and moderate historians show no preference for either. Overall, participants preferred progressive abstracts, largely reflecting a majority of left-leaning historians in our sample. Moreover, after reconsidering their responses and providing FOR ratings, participants’ initial decisions did not significantly change.

Conclusions

Our study suggests that political preferences influence research evaluations and are not diminished by more deliberate processing, as demonstrated through the case study of historians.

Keywords

political bias, historiography, publication bias, political orientation, meta-research, publication, research bias, research integrity, social science research ethics, empirical studies of research ethics, normativity in science and technology

Author Note

Correspondence concerning this article should be addressed to Louis Schiekiera, Clinical Psychological Intervention, Department of Education and Psychology, Freie Universität Berlin, Schloßstraße 1, 12163 Berlin, Email: l.schiekiera@fu-berlin.de

Introduction

In quantitative research fields, it has long been discussed theoretically, and observed empirically under the term publication bias, that studies are favored for publication based on factors other than research quality (Fanelli, 2010, 2011; Dirnagl & Lauritzen, 2010). While not explicitly framed as publication bias, several scholars in historiography have similarly highlighted distortions in the research landscape caused by non-quality-related study characteristics (Bhat et al., 2023; Lustick, 1996; McCullagh, 2000; Rummel, 2002). A key issue that is often discussed in this context is how the cultural, social, political, ethnic, or economic background of historians influences the selection of topics, methods, and data, as well as the interpretations and conclusions drawn from these data (Bhat et al., 2023; Keyzer & Richardson, 2022; Kimball, 1984; Lustick, 1996; McCullagh, 2000; Rummel, 2002). This phenomenon, typically referred to as cultural bias (McCullagh, 2000; Rummel, 2002), suggests that historians’ own cultural perspectives shape their research focus, methodologies, and the selection and interpretation of historical events, potentially leading to skewed or incomplete historical narratives (Bhat et al., 2023; Lustick, 1996).

Political biases in history and science

A specific focus within the study of cultural biases in history has been the influence of political and ideological characteristics on the selection and interpretation of historical sources (Bhat et al., 2023; Kimball, 1984; Lustick, 1996; McCullagh, 2000). A classical survey by Kimball (1984) on ideology and attitudes towards causes of 20th-century US wars reported strong correlations between historians’ ideological standpoints and their explanations of historical events.

Scholars who argue that such biases are problematic for historiography emphasize that historians have a social responsibility to produce accurate and fair descriptions and explanations of historical events; they contend that political biases can distort these descriptions and explanations and undermine the integrity of historical scholarship (Keyzer & Richardson, 2022; McCullagh, 2000).

Conversely, historians like White (1973) and Jenkins (1991) have emphasized the notion of history as a subjective scientific endeavor, stressing the role of political orientation and ideology in shaping historical narratives and interpretations. In a similar vein, Kimball (1984) suggests that ideological labels like conservative or liberal represent different “styles of thinking” rather than “unchangeable, structural, internal biases” (p. 370). We acknowledge, first, that history cannot be understood as an exact science, despite early claims to the contrary (Woods, 1911). Second, drawing on the sociology of knowledge, we recognize that human knowledge and attitudes are situated within and influenced by specific social, cultural, and political contexts (Berger & Luckmann, 1966; Simandan, 2019). However, we suggest that treating the preference for scientific texts that align with one’s political orientation as unproblematic falls into the relativist trap of overlooking two important aspects.

First, this consideration does not acknowledge central achievements of modern history in overcoming biases, such as the fact that “until comparatively recently, Western historiography has generally been written from the perspective of white, Christian, upper-middle-class males” (McCullagh, 2000, p. 64), while the history of people of other ethnic and social groups and genders had been overlooked for centuries (Bhat et al., 2023).

Second, it ignores the argument that political censorship and self-censorship represent structural and internal biases which threaten the integrity of science (Clark et al., 2023; Duarte et al., 2015). For example, in the German state of Bavaria, a recent prohibition was placed on the use of gender-equitable language by state employees, including scientists (Deutsche Welle, 2024).1 In the United States, meanwhile, several state governments have enacted laws banning the teaching of critical race theory (Lukianoff et al., 2021). Moreover, restrictions on academic freedom imposed by the Hungarian government led the Central European University to relocate to Austria (Barabási, 2017). However, scholarly activity is not only censored by state interventions. Rather, (self-)censorship is common within the academic sector itself. Social psychologists have recently observed an increasing prevalence of self-censorship in academia, driven by prosocial motives, with scholars fearing that their research might be misused by malicious actors to justify harmful policies and attitudes (Clark et al., 2023). A common phenomenon in this context is objections to information that portrays historically disadvantaged groups unfavorably (Clark et al., 2023; Duarte et al., 2015). This is especially crucial in the field of historiography, where the representation of historical events and figures can be highly contentious (Rummel, 2002). Scholars may avoid or downplay findings that might be perceived as controversial or politically sensitive, leading to a distorted or incomplete understanding of history (McCullagh, 2000). The favoring of historiographical texts that correspond to one’s political orientation, while rejecting those that conflict with one’s political views, may have comparable consequences to those of publication bias within the quantitative sciences.

Decision-making and political biases

While Kimball’s survey aimed to describe the relationship between historians’ ideological standpoints and their attitudes towards historical events (Kimball, 1984), experimental studies on decision-making in historical research are lacking. However, such studies are crucial for understanding how these preferences influence scholarly actions such as publication decisions, peer review, or grant allocation. In an experimental study in the field of psychology, Abramowitz et al. (1975) presented psychology reviewers with two versions of a scientific manuscript that were identical except that references to student activists and nonactivists were interchanged, and asked the participants to assess the publishability of the manuscript. The results revealed that reviewers’ political orientation influenced their decisions to publish the manuscript. However, Abramowitz et al. (1975) only investigated the rating of a single experimentally varied manuscript, which is a clear limitation of their study.

Furthermore, the dynamics of decision-making in the context of publication decisions in historiography remain underexplored. Dual process theories (DPT) offer a theoretical framework to understand the relationship between fast, automatic judgment processes (Type 1) and more deliberate, analytical judgment processes (Type 2) (Evans, 2006; Kahneman, 2003). To investigate the reliance on either Type 1 or Type 2 processing during decision-making, researchers have used two-response procedure experiments (Thompson et al., 2011; Thompson & Johnson, 2014). In these experiments, participants typically provide three evaluations of a presented stimulus. First, they give a quick, intuitive response, which reflects Type 1 processing (Thompson et al., 2011). This is followed by an assessment of the Feeling of Rightness (FOR) regarding the initial response, defined as the “degree to which the first solution that comes to mind feels right” (Ackerman & Thompson, 2017, p. 608). Finally, participants offer a second response, during which they are allowed ample time to reconsider their initial answer and provide a more deliberate response, indicative of Type 2 processing (Thompson & Johnson, 2014). In the present study, we used this two-response paradigm to analyze historians’ decision-making in view of their political orientation.

Present study

The present study aimed to examine whether historians prefer contemporary history abstracts that align with their political orientation. Therefore, we conducted an online experiment in which fictitious contemporary history abstracts were presented in a within-subjects design with political stance as the experimental treatment variable. Our nested within-subjects linear mixed model design allowed us to isolate the effect of the experimentally varied treatment variable: the political stance of an abstract. In addition, we investigated decision-making processes using the aforementioned two-response procedure based on DPT (Kahneman, 2003; Thompson et al., 2011). Specifically, we examined the interaction effects between an abstract’s political stance and historians’ political orientation with respect to the following response variables: the intuitive likelihood of submitting an abstract for publication (Type 1 processing), changes between Type 1 and the more deliberate Type 2 processing, and the accompanying FOR of intuitive evaluations. In summary, this study addresses three gaps in the current literature: (A) experimental within-subject design studies on political bias in contemporary history, (B) experimental research on non-publication in contemporary history, and (C) the role of decision-making processes in historical publication activity.

Hypotheses

H1: Intuitive Responses

First, we assumed that abstracts that do not align with researchers’ political orientation often receive negative assessments during Type 1 responses, where rapid and automatic processing occurs. This mode of processing relies on mental shortcuts and pattern recognition (Thompson et al., 2011; Thompson & Johnson, 2014). From our perspective, individually undesired political positions may be perceived as salient patterns within research abstracts and papers, creating the impression among researchers of diminished publication and reception success. To test this, we formulated Hypothesis 1 (H1), which concerns the intuitive likelihood of submitting an abstract for publication (ILoS).

H1. Intuitive Responses: Historians report a higher intuitive likelihood of submitting an abstract for publication if their political orientation matches the political stance of the abstract.

H2. Response Change from Type 1 to Type 2 Processing

Second, we assumed that abstracts that do not align with researchers’ political orientation are evaluated less negatively during Type 2 compared to Type 1 processing, as heightened and purposeful cognitive processing enables initially biased decisions to be refined (Thompson et al., 2011; Thompson & Johnson, 2014). When engaging in deliberate thinking, researchers may become more conscious of heuristics such as their political preferences in scientific decision-making, prompting them to reassess their initial negative responses (Colombo & Steenbergen, 2020). Thus, Hypothesis 2 (H2) examined changes in the likelihood of submitting an abstract for publication (ΔLoS), defined as the difference between intuitive responses and the considered likelihood of submitting an abstract for publication (CLoS).

H2. Response Change from Type 1 to Type 2 Processing: During the more deliberate and reflective Type 2 processing, historians report a higher likelihood of submitting an abstract for publication, despite a conflicting political stance, as compared to during their initial intuitive Type 1 assessments.

H3. Feeling of Rightness

Third, we assumed that abstracts that do not align with researchers’ political orientation are linked to lower FOR judgments after intuitive processing. This assumption derives from experimental observations indicating that lower FOR judgments are more likely to be triggered by conflict stimuli, which induce cognitive conflict in decision-making, compared to non-conflict stimuli (Thompson & Johnson, 2014, p. 226). Accordingly, we formulated Hypothesis 3 (H3) to explore the role of FOR in decision-making.

H3. Feeling of Rightness: Historians provide lower Feeling of Rightness ratings for abstracts with a political stance that conflicts with their political orientations.

Methods

Open practices

Our study was preregistered on the Open Science Framework (OSF). Since self-reported political orientation measures can be considered sensitive information, we will not publicly disclose the data for this study.

Procedure

The experiments were programmed in jsPsych (De Leeuw, 2015), an open-source JavaScript library. The source code is available at GitHub and an archived version can be accessed via Zenodo. jsPsych is licensed under the MIT License, an OSI-approved open-source license. The experiments were run on the platform pavlovia.org, and appropriate credits were purchased to store data and comply with the platform’s usage policies. The experimental procedure is depicted in Figure 1. During the experiments, participants first read a set of instructions, in which they were told that the study aimed to explore how historiographical articles are read and assessed. Participants were instructed to imagine that the study was part of their routine work at their university. They were told that the abstracts provided were written by colleagues, and that they needed to decide which papers should be submitted for publication based solely on these abstracts. Due to limited capacity, participants could only choose a certain proportion for submission. Moreover, they were informed about the study’s focus on decision-making behaviors during manuscript preparation and publication and told that they would evaluate prototypical historiographical abstracts based on reliable historical sources.


Figure 1. Experimental procedure.

Note. PO = self-reported political orientation.

The participants were then presented with fictitious contemporary history abstracts. In total, there were 34 abstracts, consisting of 17 pairs. The abstracts in each pair were identical except that the political stance (progressive vs. conservative) was systematically varied. Each participant was presented with 17 abstracts, with only one version (i.e., progressive or conservative) from each pair: 8 conservative abstracts and 9 progressive abstracts. The selection of abstracts to be presented was randomized and balanced using jsPsych’s jspsych.randomization module (De Leeuw, 2015).
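As an illustration, the per-participant assignment described above can be sketched as follows. This is a Python sketch for exposition only; the actual experiment used jsPsych’s randomization utilities, and the balancing of stance assignments across participants is not shown here.

```python
import random


def assign_stances(n_pairs=17, n_progressive=9, seed=None):
    """For one participant: choose which abstract pairs are shown in their
    progressive version (9) and which in their conservative version (8)."""
    rng = random.Random(seed)
    stances = (["progressive"] * n_progressive
               + ["conservative"] * (n_pairs - n_progressive))
    rng.shuffle(stances)  # randomize which pair receives which stance
    return {pair_id: stance for pair_id, stance in enumerate(stances, start=1)}


assignment = assign_stances(seed=0)
```

Each participant thus rates every pair exactly once, while the stance shown for any given pair varies across participants.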

We adapted the two-response procedure for the experimental block as follows: Participants were instructed to provide an initial, intuitive response to the stimulus (Type 1). To maximize the probability that participants would provide the first answer that came to mind, they were told that this response should be given quickly, intuitively, and with a minimum of thought. This was followed by an assessment of FOR, after which participants were allowed as much time as needed to reconsider their initial answer and provide a final answer (Type 2). Participants also had the option to re-read the abstract. The reaction times for Type 1 and the change in the decision in Type 2 were recorded.

Upon completing their ratings of the 17 abstracts, participants were asked to complete a series of questionnaires. First, they provided basic sociodemographic information, including age, gender, and academic position. Next, participants were asked about their political orientation, ideological beliefs, and their views on the relationship between ideology and history. This was followed by a second sociodemographic questionnaire that collected data on their country of residence, employment status, and scientific publication activity. Participants were also asked to specify their research focus, such as their subdiscipline and historiographical school. Finally, they completed the Conscientiousness subscale of the BFI-2 personality inventory.

Participants

Sample Recruitment Process

In a meta-analysis, Ditto et al. (2019) found a mean overall partisan bias effect size of r = .245, with similar correlations for progressives (r = .235) and conservatives (r = .255). We took this effect size as the guideline for our power analysis. Converting the correlation coefficient to Cohen’s d resulted in d = 0.505. Next, we conducted a power analysis (Westfall, Kenny, & Judd, 2014; see the OSF project for further details) with an expected effect size of d = 0.505 using a sample of N = 75 participants and N = 17 stimuli per experiment, which resulted in an estimated power of 0.807.
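The r-to-d conversion reported above follows the standard formula for two groups of equal size, d = 2r/√(1 − r²); a minimal sketch reproducing the reported value (the power analysis itself followed the Westfall et al. approach and is documented in the OSF project):

```python
from math import sqrt


def r_to_d(r):
    """Convert a correlation effect size r to Cohen's d (equal group sizes)."""
    return 2 * r / sqrt(1 - r ** 2)


print(round(r_to_d(0.245), 3))  # 0.505, the value used in the power analysis
```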

From April to August 2024, contact information of academics who published articles on history studies was extracted from Web of Science and OpenAlex. Author names and email addresses were matched, and personalized invitations were sent to historians to complete the survey. A total of 7,063 historians were contacted, of whom 299 started the survey (response rate of 4.2%). However, 224 of these participants did not complete the experiment. Thus, responses from a total of 75 participants who completed the study were included in the analyses (response rate 1.1%). One additional run was conducted by the researchers themselves to troubleshoot an issue with data collection on the Pavlovia platform. The average time needed to complete the experiment was 26.25 minutes (SD = 13.51). In total, the study encompassed 1,275 trials (75 participants × 17 abstracts).
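The reported rates follow directly from the recruitment numbers; a quick arithmetic check:

```python
contacted, started, completed = 7063, 299, 75

print(f"{started / contacted:.1%}")    # response rate among contacted: 4.2%
print(f"{completed / contacted:.1%}")  # completion rate: 1.1%
print(completed * 17)                  # total trials: 1275
```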

Historians were recruited globally, with invitations sent to researchers from various countries and regions without any geographical restrictions. To be eligible for participation in the study, historians needed to be currently employed in a history-related research group (including PhD/doctoral scholars). Furthermore, since the abstracts were presented in English, participants needed to understand academic English and feel comfortable reading English-language abstracts. Participants did not receive any reimbursement for taking part in the study.

Participant Consent

Prior to participation, participants provided explicit consent through an online consent form embedded in the experimental interface. This consent was captured electronically, as participants clicked “I agree to the terms and conditions of participation and would like to take part in the study.” This electronic consent process was reviewed and approved by the ethics committee.

Participants were presented with comprehensive information about the study, including:

  1. The purpose of the research (evaluation of historiographical abstracts for decision-making behaviors in publication).

  2. Anonymity of responses and secure storage of anonymized data for a minimum of 10 years.

  3. Assurance that participation was voluntary and could be terminated at any time without consequences.

Materials

A total of 17 case pairs of short contemporary history abstracts were generated. Each pair consisted of two abstracts that were identical with the exception of the political stance. Table 1 shows two example pairs of abstracts, on terrorism in Italy in the 1970s/1980s and on Brazilian trade unions during the 1960s. A summary of all 17 abstract pairs used in this study is provided in Table 2, and all abstracts are provided in full in the project’s OSF repository. The political stance in each abstract pair was manipulated through the content and focus of the abstracts. For example, abstracts categorized as “progressive” emphasized left-leaning topics such as social justice, postcolonialism, feminist values, and the need for societal reform. In contrast, “conservative” abstracts focused on themes like tradition, national pride, economic stability, and the preservation of established systems (Jost et al., 2003; Kandler et al., 2012).

Table 1. Example abstracts with varied political stance.

Topic: Terrorism in Italy in the 1970s/1980s
Progressive: This paper revisits the 1980 Strage di Bologna, a bombing at Bologna Central Station that resulted in 85 deaths and over 200 injuries, an event orchestrated by the neofascist terror organization Nuclei Armati Rivoluzionari. The paper demonstrates how this event highlighted the dangers posed by domestic neofascist terrorist groups and their impact on Italian society. By analyzing legal and parliamentary documents, this paper traces a genealogy of legal and political reforms to combat right-wing extremism and terror in Italy.
Conservative: This paper revisits the 1978 kidnapping and murder of Aldo Moro, the former prime minister and then-president of the Democrazia Cristiana party, an event orchestrated by the communist terror organization Brigate Rosse. The paper demonstrates how this event highlighted the dangers posed by domestic left-wing terrorist groups and their impact on Italian society. By analyzing legal and parliamentary documents, this paper traces a genealogy of legal and political reforms to combat left-wing extremism and terror in Italy.

Topic: Brazil’s Trade Unions in the 1960s
Progressive: Following World War II, Brazil witnessed remarkable economic growth, ascending to the tenth position among the world’s largest economies by 1960. Despite this, data indicates a disparity between real wages and productivity growth starting in 1956, a deviation from the successful “social compact for growth” model seen in Europe and Japan’s “golden age”. A key reason was that in Brazil right-wingers oppressed the influence of trade unions and pushed an agenda of economic reform that had long-lasting detrimental societal effects.
Conservative: Following World War II, Brazil witnessed remarkable economic growth, ascending to the tenth position among the world’s largest economies by 1960. Despite this, data indicates a disparity between real wages and productivity growth starting in 1956, a deviation from the successful “social compact for growth” model seen in Europe and Japan’s “golden age”. A key reason was that in Brazil left-wingers controlled the main trade unions and pushed an agenda of social reform that had long-lasting detrimental economic effects.

Table 2. Overview of all abstract pairs presented during the experiment.

No. | Topic | Progressive Abstract | Conservative Abstract
01 | Terrorism in Italy | Strage di Bologna conducted by right-wing terrorists | Aldo Moro kidnapping conducted by left-wing terrorists
02 | Humboldt Forum | Missed decolonial reinterpretation of the past | Necessary reclamation of Germany’s Prussian past
03 | Industrialization in Peru | Focus on factory workers | Focus on industrialists
04 | Benin Bronzes | Pro restitution | Contra restitution
05 | Sino-Vietnamese War | Subaltern studies perspective | High politics perspective
06 | Al-Shifa Hospital | Contra bombing: humanitarian impact | Pro bombing: strategic rationale
07 | The Bell Curve | Critical of “The Bell Curve” | Supportive of “The Bell Curve”
08 | Coup in Chile | Illegitimate concerns: US influence made coup possible | Legitimate concerns by the US
09 | Southwestern US | Focus on environmentalists | Focus on industrialists
10 | Brazil’s Trade Unions | Detrimental effects by right-wingers | Detrimental effects by left-wingers
11 | Iceland Politics | Critical of left SDP | Critical of conservative IP
12 | South Korea Economics | Deregulatory economics precede the crisis | Regulatory economics precede the crisis
13 | British Empire | Legacies of colonial inequality | Legacies of a benign modernizing force
14 | Dutch Economic Policy | Pro Keynesianism | Contra Keynesianism
15 | Indian Media | Focus on Chipko (ecofeminists) | Focus on RSS (Hindu Nationalism)
16 | History Subdisciplines | Focus on Social History | Focus on Military History
17 | Global Financial Crisis | Blame of deregulation | Blame of affordable housing policies

Some abstracts were influenced by real historical studies, books, or talks, such as Kim’s (2005) analysis of the political logic of economic crisis in South Korea, Harmsma’s (2023) exploration of a Keynesian intermezzo and neoliberalism in the Netherlands during the 1970s, Colistete’s (2007) study on productivity, wages, and labor politics in 1960s Brazil, Zimmerer’s (2017) discussion on the Humboldt Forum in German identity discourses, Kocka’s (1995) examination of leftist aspects in social history, and Ferguson’s (2012) book Empire: How Britain Made the Modern World, which provides a provocative reinterpretation of the British Empire. We utilized the assistance of ChatGPT-4 to generate further ideas for opposing views on historical controversies. Stimuli were quasi-randomly presented to participants (9 progressive and 8 conservative abstracts).

Measures

For each abstract, we measured the reading time, the response times (RT) for Type 1 and Type 2, the initial likelihood of submitting for publication (ILoS, Range = 0-100%), the FOR rating (1-7, with “When I gave the answer, I felt” ranging from “Very uncertain” to “Very certain”), the second, considered response regarding the likelihood of submitting for publication (CLoS, Range = 0-100%), and the difference between the two responses (ΔLoS, Range = -100 to 100 percentage points).

Political orientation was captured using the following single item: “In politics, people often talk about ‘left’ or ‘right’. Where would you place yourself on this scale from ‘left’ to ‘right’?” The item was rated on a 7-point scale, and we categorized ratings of 1-3 as left-leaning, 4 as moderate, and 5-7 as right-leaning (Inbar & Lammers, 2012). Despite the brevity of this measurement instrument, single-item political orientation measures are the most frequently used method to assess political orientation in political psychology (Krieger et al., 2019).
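The categorization rule can be written out as a small helper; this is a sketch of the rule exactly as described above, with a hypothetical function name:

```python
def categorize_orientation(rating):
    """Map a 1-7 left-right self-placement to the categories used in the study:
    1-3 = left-leaning, 4 = moderate, 5-7 = right-leaning."""
    if not 1 <= rating <= 7:
        raise ValueError("rating must be on the 7-point scale")
    if rating <= 3:
        return "left-leaning"
    if rating == 4:
        return "moderate"
    return "right-leaning"
```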

Further measures included participants’ age (indicated in ranges), gender, the country to which their respective research organization is affiliated, and their position at the research organization. We also collected data on their reviewer history (number of peer reviews conducted) and their authorship experience (number of papers authored as first and last author). Additionally, we measured participants’ familiarity with open science and publication bias (each rated from 1 = not at all familiar to 5 = very familiar) and the pressure they feel to publish their research (from 1 = not at all pressured to 5 = very pressured). Participants were further asked about their work in different areas of history and their historiographical school of thought based on questions from the classical study by Kimball (1984) and allowing for multiple responses. From the same study, we derived questions regarding ideological shifts over the past ten years, historians’ views on the correlation between specific ideologies and historical theories, and the belief that all historians have ideologies. Additionally, participants were presented with six items from the BFI-2-S conscientiousness subscale to investigate whether their considered ratings were driven by conscientiousness rather than the political stance of an abstract (Soto & John, 2017). The experimental HTML/jsPsych script including all items can be found in the project’s OSF repository.

Statistical analysis

Confirmatory Analyses

All hypotheses were analyzed using multilevel models. Our multilevel model (MLM) accounts for data being nested within both participants and abstracts, proposing that the likelihood of submitting an abstract for publication (LoS) is influenced by characteristics of both participants and abstracts. However, the structure of our experimental data is not entirely hierarchical; the experiment involves presenting the same 17 abstracts to all participants, with the abstracts only differing according to the political stance manipulation. This results in a cross-classified data structure within a counterbalanced design, which we address using cross-classified multilevel modeling (Hox, 2002). To investigate the impact of intuitive (H1) and considered (H2) LoS responses, as well as the FOR (H3) judgments relative to the participants’ political orientation and the political stance of the abstracts, we used the following general model with the lme4 (Bates et al., 2015) package for linear mixed models, using restricted maximum likelihood (REML) estimation:

response ~ political stance (abstract) + political orientation (individual) + political stance (abstract) * political orientation (individual) + (political stance (abstract)|participant) + (1|abstract)

This model decomposes the variation in LoS for participant j rating abstract i into random and fixed effects. The notation (political stance (abstract) | participant) represents (a) a random slope for participants based on the political stance of an abstract and (b) a random intercept at the participant level. In addition, (c) the term (1|abstract) represents a random intercept at the abstract level. The terms political stance (abstract) and political orientation (individual) capture the fixed regression effects, while political stance (abstract) * political orientation (individual) represents the interaction effect. In all models, progressive abstracts are coded as political stance (abstract) = -1 and conservative abstracts are coded as political stance (abstract) = 1.
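To make the coding concrete, the fixed-effect part of the model can be sketched as follows; the coefficient values are hypothetical and purely illustrative. With stance coded -1/+1 as above and orientation scored so that negative values indicate left-leaning participants, a positive interaction coefficient raises the predicted LoS exactly when stance and orientation share the same sign, i.e., for congruent pairings:

```python
def predicted_los(stance, orientation,
                  b0=50.0, b_stance=0.0, b_orientation=0.0, b_interaction=5.0):
    """Fixed-effect part of the cross-classified MLM (random effects omitted).
    stance: -1 (progressive) or 1 (conservative); orientation: centered,
    negative = left-leaning. All coefficient values here are hypothetical."""
    return (b0 + b_stance * stance + b_orientation * orientation
            + b_interaction * stance * orientation)

# With a positive interaction coefficient, congruent pairings (same sign)
# yield a higher predicted LoS than incongruent ones.
```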

Exploratory Analyses

We fitted an additional MLM (E1) for ILoS as the response variable to determine the influence of several covariates. First, we examined the influence of different geographical regions (dummy variables for Northern America, South/Central America, Asia, and Oceania) on the ILoS variable, with Europe as the reference category. Therefore, we included additive terms for the four mentioned geographical regions in the MLM.

Second, we investigated the influence of professional status group on ILoS in the MLM. We separated the categorical variable professional status group (levels: pre-doctoral level, post-doctoral level, professor level) into two dichotomous predictors: postdoc and professor. Hence, pre-doctoral level served as the reference category. We included additive terms for both status group predictors and the treatment in the MLM.

Third, we investigated whether fatigue influenced researchers’ decisions, as both motivation and concentration are likely to vary between the beginning, middle, and end of the evaluation of the abstracts. Therefore, we included a trial variable (numeric; values from 1-17; indexing the position at which each abstract was presented) as an additive term in the model.

Fourth, we investigated whether the reading time of the abstract influenced historians’ decisions, since longer reading times might reflect increased cognitive engagement and deeper processing, potentially leading to more considered judgments or alterations in initial intuitive responses. We normalized the variable before including it in the model.
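The text does not specify the normalization method for reading time; assuming z-standardization, a minimal sketch (the values are illustrative):

```python
import numpy as np

# Illustrative reading times in seconds; z-standardization is an assumption,
# as the normalization method is not specified in the text.
reading_time = np.array([3.2, 4.8, 7.5, 2.9, 5.1])
z = (reading_time - reading_time.mean()) / reading_time.std(ddof=1)

# A standardized predictor has mean 0 and unit standard deviation, so its
# coefficient is interpretable per standard deviation of reading time.
```

Standardizing the predictor also puts its coefficient on a scale comparable to the other covariates in the model.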

Integrating these analyses into the models resulted in the following lme4 model:

ILoS ~ political stance (abstract) + political orientation (individual) + political stance (abstract) * political orientation (individual) + northern_america + south_central_america + asia + oceania + postdoctoral + professorial + trial + reading_time + (political stance (abstract)|participant) + (1|abstract)

Finally, for ΔLoS, we investigated in a further model (E2) whether researchers’ tendency to change their rating of an abstract after providing an initial response is driven by conscientiousness rather than the political stance of the abstract. This disposition may stem from the inclination of highly conscientious individuals to feel a stronger sense of obligation to alter their responses when prompted to reconsider the problem and provide a secondary evaluation. We therefore took the absolute value of ΔLoS (|ΔLoS|), as we were interested in whether conscientiousness influences the magnitude of the change in answers, regardless of its direction. We introduced the BFI-2-S conscientiousness subscale as a covariate (conscientiousness) into the following lme4 model:

|ΔLoS| ~ political stance (abstract) + political orientation (individual) + political stance (abstract) * political orientation (individual) + conscientiousness + (political stance (abstract)|participant) + (1|abstract)
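The construction of ΔLoS and |ΔLoS| from the two ratings can be sketched as follows (the column names and the sign convention, considered minus intuitive, are assumptions for illustration):

```python
import pandas as pd

# Illustrative trial-level responses; column names are assumptions.
trials = pd.DataFrame({
    "ilos": [60, 45, 70],  # intuitive likelihood of submitting
    "clos": [55, 50, 70],  # considered likelihood of submitting
})

# Signed change (sign convention assumed) and its absolute value:
# |ΔLoS| captures how much a rating moved on reconsideration,
# regardless of the direction of the change.
trials["delta_los"] = trials["clos"] - trials["ilos"]
trials["abs_delta_los"] = trials["delta_los"].abs()
print(trials["abs_delta_los"].tolist())  # [5, 5, 0]
```

Using the absolute value means that an upward and a downward revision of equal size contribute identically to the conscientiousness analysis.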

Further model specifications

We used R version 4.3.1 (R Core Team, 2023) for statistical analysis. The alpha level was set at 0.05. Normality of residuals was examined for all MLMs (Finch et al., 2019). Refer to the OSF repository for the R code and the QQ-plots.

Results

Descriptive statistics

Descriptive statistics of the sociodemographic data are shown in Table 3, and Table 4 depicts the differences in responses and reaction times between ILoS and CLoS. Table 5 contains the results of t-tests for differences between conservative and progressive abstracts for the various response variables. Figure 2 provides histograms and density plots for reading times and response variables, while Figure 3 shows the distribution of the political orientation variable.

Table 3. Sociodemographic data.

Category                                   n (%)
Gender
  Male                                     50 (67%)
  Female                                   24 (32%)
  Diverse                                  1 (1%)
Age Group
  18-25                                    1 (1%)
  26-35                                    13 (17%)
  36-45                                    30 (40%)
  46-55                                    16 (21%)
  56 or over                               15 (20%)
Academic Position
  Professor                                43 (57%)
  Postdoctoral                             24 (32%)
  Predoctoral                              8 (11%)
Location of Research Chair/Organization
  Europe                                   48 (64%)
  Northern America                         12 (16%)
  South/Central America                    9 (12%)
  Asia                                     3 (4%)
  Oceania                                  3 (4%)

Table 4. Paired two-sample t-tests for differences between intuitive and considered responses and reaction times.

Variable     M (ILoS)   M (CLoS)   Difference   95% CI           t      p
Response     55.83      55.38      0.46         [-0.004, 0.92]   1.94   .052
RT (sec.)    4.78       3.91       0.87         [0.64, 1.10]     7.36   < .001

Table 5. Two-sample t-tests for differences between conservative and progressive abstracts for different response variables.

Variable   M (Progressive)   M (Conservative)   Difference   95% CI          t       p
ILoS       57.28             54.21              3.07         [0.26, 5.88]    2.14    .032
CLoS       56.60             54.01              2.59         [-0.25, 5.44]   1.79    .074
ΔLoS       0.68              0.20               0.48         [-0.44, 1.40]   1.02    .310
FOR        5.06              5.08               -0.02        [-0.16, 0.11]   -0.36   .722

Figure 2. Density plots and histograms for descriptive statistics of reading time and response variables of n = 1,275 trials.

Note. For the first subplot (top left), n = 10 trials with reading times > 300 seconds were removed from the plot as outliers to improve clarity and readability. Labels for the last subplot (bottom right): VU = Very Uncertain, U = Uncertain, RU = Rather Uncertain, NC/UC = Neither Certain Nor Uncertain, RC = Rather Certain, C = Certain, VC = Very Certain.


Figure 3. Distribution of political orientation: Left – right orientation scale.

Note. n = 75 responses. The solid line demarcates the mean response of the sample; the dashed line depicts the median response; dotted lines indicate the standard deviation from the mean.

The majority of participants (88%) reported no change in their ideological views during the past decade. A small percentage reported a shift in their ideology, with 8% moving from left to right or from left to center, and 4% moving from right to left or from right to center. Most participants (64%) agreed that there is a meaningful correlation between specific ideologies and specific historical theories, 15% strongly agreed with this statement, 13% were neutral, and only a minority (8%) disagreed. A substantial proportion of participants (84%) agreed or strongly agreed that all historians have ideologies (agree: 45%, strongly agree: 39%), while a smaller proportion were neutral (4%) or disagreed (11%), and only 1% strongly disagreed. Further descriptive data regarding publication activity and research foci can be found in the OSF repository.

Confirmatory analyses

H1: Intuitive Responses, Participants’ Political Orientation, and Abstracts’ Political Stance

The total explanatory power of our main model for H1 was substantial (conditional R2 = 0.37); the explanatory power for the part related to the fixed effects alone (marginal R2) lay at 0.02. The treatment variable political stance (coded as 1 = conservative, -1 = progressive) was significantly associated with ILoS (b = -6.74, p < .001), indicating that a conservative stance was associated with a reduction in ILoS by 13.48 units (2 * 6.74) as compared to a progressive stance. Political orientation was not significantly related to ILoS (b = 1.94, p = .100), while the interaction effect political stance x political orientation significantly predicted ILoS (b = 1.82, p < .001). The model’s intercept, corresponding to political stance (abstract) = 0 and political orientation (individual) = 0, lay at 50.12 (p < .001). Figure 4 shows differences in the ILoS responses as a function of political orientation for both conservative and progressive abstracts.
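Plugging the reported fixed effects into the model equation reproduces the 13.48-unit gap. A fixed-effects-only sketch (random effects are ignored, so this is the predicted value for an average participant and abstract):

```python
# Reported fixed-effect estimates for H1: intercept, political stance,
# political orientation, and their interaction.
b0, b_stance, b_orient, b_inter = 50.12, -6.74, 1.94, 1.82

def ilos_hat(stance, orientation):
    """Fixed-effects-only prediction of ILoS.

    stance: +1 = conservative, -1 = progressive (effect coding from the model).
    """
    return b0 + b_stance * stance + b_orient * orientation + b_inter * stance * orientation

# For a politically moderate rater (orientation = 0), the conservative-minus-
# progressive gap equals 2 * b_stance, since stance moves from -1 to +1.
gap = ilos_hat(1, 0) - ilos_hat(-1, 0)
print(round(gap, 2))  # -13.48
```

The positive interaction coefficient shrinks this gap as orientation moves toward the right-leaning end of the scale, which is the pattern visible in Figure 4.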


Figure 4. Intuitive likelihood of submitting for publication by political orientation and treatment.

Note. Error bars represent the standard error of the mean.

H2: Considered Responses, Participants’ Political Orientation, and Abstracts’ Political Stance

To analyze H2, which investigated the change in responses (ΔLoS) with respect to political orientation and political stance, we had to drop the random slope due to a singular fit of the random slope model. The resulting, more parsimonious random intercept model yielded weak explanatory power (conditional R2 = 0.03), with only a very small proportion attributed to the fixed effects alone (marginal R2 = 0.001). The treatment effect, representing the political stance of the abstract, was not statistically significant (b = -0.54, p = .345). Political orientation also did not significantly predict ΔLoS (b = -0.20, p = .840), and the interaction between political stance and political orientation was likewise not significant (b = 0.11, p = .547). The model’s intercept, corresponding to political stance (abstract) = 0 and political orientation (individual) = 0, lay at 0.57 (p = .547).

H3: Feeling of Rightness, Participants’ Political Orientation and Abstracts’ Political Stance

To analyze H3, which investigated the FOR responses with respect to political orientation and political stance, again, the random slope model had to be dropped due to a singular fit, leaving us with a random intercept model. This model showed substantial explanatory power (conditional R2 = 0.36), although the proportion attributed to fixed effects alone was minimal (marginal R2 = 0.002). The treatment effect, representing the political stance of the abstract, was not statistically significant (b = -0.05, p = .472). Political orientation also did not significantly predict FOR responses (b = -0.04, p = .581), and the interaction between political stance and political orientation was not significant either (b = 0.02, p = .334). The model’s intercept, corresponding to political stance (abstract) = 0 and political orientation (individual) = 0, lay at 4.18 (p < .001).

Exploratory analyses

Right-leaning historians rated conservative abstracts with a mean ILoS score of 65.6, which was significantly higher than their ratings of progressive abstracts, at 57.2, t(150.24) = -2.04, p = .043. Moderate historians rated conservative abstracts with a mean score of 55.6 and progressive abstracts with a similar mean score of 56.0, t(167.82) = 0.10, p = .918, indicating no significant difference. Left-leaning historians rated conservative abstracts with a mean score of 52.1, which was significantly lower than their score for progressive abstracts, at 57.5, t(927.24) = 3.31, p < .001. The ILoS ratings provided by left-leaning historians for conservative abstracts were significantly lower than the overall mean rating, t(447) = -3.11, p < .001. In contrast, right-leaning historians rated conservative abstracts significantly higher than the overall mean rating, t(71) = 3.3, p < .001.

In an exploratory analysis, we fitted an additional MLM (E1) to examine the influence of various covariates on ILoS. As in the model for H1, we observed non-significant effects for political orientation (b = 1.74, p = .157), but significant effects emerged for an abstract’s political stance (b = -6.71, p < .001) and for the interaction political orientation x political stance (b = 1.81, p = .001). However, none of the geographical region variables—Northern America (b = -4.86, p = .289), South/Central America (b = 0.78, p = .871), Asia (b = 10.91, p = .174), and Oceania (b = 0.92, p = .909)—exerted a significant effect on ILoS. Similarly, professional status variables—postdoctoral (b = -4.89, p = .393) and professor (b = -3.96, p = .463)—did not show significant effects on ILoS. Additionally, the effects of order of abstract presentation (b = -0.05, p = .649) and reading time (b = -0.07, p = .913) were not significant.

In a further linear mixed model (E2), we investigated the influence of conscientiousness on the |ΔLoS| ratings. As in the model for H2, the effects of political stance (b = 0.49, p = .297), political orientation (b = 0.05, p = .865), and their interaction (b = -0.21, p = .153) were not significant. However, the analysis revealed a significant negative effect of conscientiousness on |ΔLoS| ratings (b = -0.26, p = .005). As in the models for H2 and H3, singularity required us to drop the random slope at the participant level and, additionally, the random intercept at the abstract level.

Discussion

The present study offers new insights into the relevance of researchers’ political preferences for their decision-making in publication processes. Our results indicate that political preferences significantly influence historians’ intuitive assessments of the publishability of contemporary history abstracts. Specifically, historians demonstrated a preference for abstracts that aligned with their political views and were more likely to deem them publishable in their initial judgments. This corresponds to the broader literature on political bias in academia, which has demonstrated that researchers tend to favor work that resonates with their own ideological leanings (Abramowitz et al., 1975; Inbar & Lammers, 2012; McCullagh, 2000).

On average, progressive abstracts were evaluated 3.07% more positively compared to conservative abstracts. This can likely be explained by our observation that, similar to other academic disciplines (Cardiff & Klein, 2005; Inbar & Lammers, 2012), historians tend to be more left-leaning on average than the general population (Gallup, 2024). In the present study, 75% of historians placed themselves in a left-leaning position, while 13% saw themselves as moderate and 12% as right-leaning. Notably, most of the mean scores for ILoS fell within the interval between 55.6 and 57.5, with the exception that left-leaning historians rated conservative abstracts significantly lower than average (M = 52.1) and right-leaning historians rated conservative abstracts significantly higher than average (M = 65.6). One potential explanation for historians’ preference for abstracts that align with their political views is confirmation bias, which refers to the tendency to select and interpret information such that it confirms one’s own expectations (Nickerson, 1998). Another explanation may lie in social identity theory, which suggests that individuals derive part of their identity from their group affiliations, including their political orientation, and this can influence their judgment of research that either affirms or contradicts their group’s political stance (Turner & Oakes, 1986).

Contrary to our expectations, the tendencies observed in the Type 1 responses did not significantly diminish during the more deliberate Type 2 processing. This finding suggests that even when historians have the opportunity to reconsider their initial judgments, they remain unchanged, highlighting the challenges in using reflection alone to mitigate the influence of aspects that are not inherent to or necessary for scientific quality.

Furthermore, the main effects of political stance and political orientation on FOR were not significant. FOR ratings may be more strongly associated with logical conflict stimuli (Thompson et al., 2011) rather than with potentially politically conflicting stimuli. The distinction between these two stimulus types may indicate that cognitive dissonance, triggered by logical conflict, plays a more important role in shaping FOR regarding the initial evaluation than does political congruence.

Our exploratory analyses suggest that geographical region, professional status, and the order of abstract presentation do not significantly influence historians’ evaluations, indicating that the influence of political preferences on scientific judgements is a pervasive phenomenon, which is not affected by additional factors such as geographical or professional context. Surprisingly, conscientiousness was negatively rather than positively associated with changes from the first to the second evaluation. It remains unclear whether this reflects less recollection and/or effort among less conscientious individuals in their ratings, leading to greater differences between the first and second ratings, or whether more conscientious individuals have greater confidence in their answers.

Scientific implications

The present study makes important contributions to the current literature on potential biases in historiography by addressing several important gaps. First, using an experimental design, we provide empirical evidence on how political orientation influences historians’ judgments about the publishability of research. This approach reaches beyond the theoretical discussions (McCullagh, 2000; Rummel, 2002), survey-based analyses (Kimball, 1984; Bhat et al., 2023), and qualitative studies (Keyzer & Richardson, 2022) in research on political bias, by utilizing a controlled setting to present diverse historical topics from progressive and conservative perspectives, thus isolating the effects of political preferences on scientific decisions.

Second, our study incorporates the dual-process theory framework, distinguishing between intuitive (Type 1) and reflective (Type 2) decision-making processes (Thompson et al., 2011). This distinction sheds light on how political preferences may operate differently depending on the level of cognitive processing, a dimension that has hitherto been underexplored in the context of scholarly activity (anonymized citation; will be added after peer review). The persistence of political preferences even during reflective stages of decision-making suggests the need for greater awareness of and possibly new strategies to mitigate such influences in historiography.

Third, our study raises important questions about whether such preferences might be considered as bias. If the acceptance/rejection of publications, topics, or analytical methods is influenced by aspects that are related not to study quality but to personal preferences, this can have critical implications. For example, in the quantitative sciences, the preference for positive and significant results leads to biased literature (Fanelli, 2010, 2011). Comparably, if publishability in historiography depends on political preferences, certain perspectives will be overrepresented while others are underrepresented, irrespective of the quality of the respective study, potentially skewing the historiographical record in favor of certain political perspectives while marginalizing others (Duarte et al., 2015). This could potentially lead to a file drawer problem (Rosenthal, 1979), similar to the non-publication of non-significant results, but specifically involving the suppression of politically unfavored articles or books from being published in prestigious journals or book series. Moreover, in the quantitative sciences, publication bias might not only encompass the preference for positive results but may also manifest as a preference for research that aligns with the scientists’, reviewers’ or editors’ political orientations (Abramowitz et al., 1975; Duarte et al., 2015; Inbar & Lammers, 2012), or as a predisposition to publish newsworthy results (Romero, 2017). Given these findings, our study can be viewed as a case study focused on historians; such an influence of political preferences is likely to be relevant in other academic disciplines, both qualitative and quantitative, and should be further investigated in those contexts.

Limitations

A first potential limitation of our study lies in the design of the abstracts used to assess political preferences. In each pair of abstracts, we focused on presenting two opposing perspectives on the same or similar historical events, with one abstract emphasizing a progressive stance and the other a conservative one. Such an approach may be problematic, because the classification of an abstract as progressive or conservative might be seen as culturally dependent. What might be considered as left or progressive in some countries might be seen as conservative or right-leaning in others (Hofstede, 2001). This variability suggests that political labels are not universally applicable and are influenced by the specific cultural and social context in which they are used (Norris, 2009). However, as we did not find significant effects regarding continent-dependent differences in our exploratory model (E1), it appears that cultural context did not exert a strong influence on differential perceptions of the political stance of the abstracts in our study. Furthermore, the selected events themselves might inherently carry different levels of emotional and political weight, which may have influenced participants’ judgments beyond the abstract content alone. For example, the Strage di Bologna, a tragic bombing attributed to the neo-fascist terrorist organization Nuclei Armati Rivoluzionari (De Lutiis, 1986), and the kidnapping and murder of Aldo Moro by the communist terrorist organization Brigate Rosse (Bocca, 1978) might evoke different emotional responses from participants that are not solely based on the political orientation of the abstracts. Despite these potential issues, we decided to use this approach to ensure that the abstracts remained grounded in real-world historical debates. This allowed us to create a more realistic experimental setting, even though some variability in participant responses may stem from the inherent characteristics of the historical events themselves rather than the political bias we aimed to investigate.

Second, we recruited only a small proportion of right-leaning (n = 9) and moderate (n = 10) historians. This raises concerns about the generalizability of our findings, particularly concerning conclusions drawn about right-leaning historians, and renders it difficult to confidently assert that the observed patterns of political preferences apply equally across the entire political spectrum. However, other studies on political orientation in psychology (Inbar & Lammers, 2012; Cardiff & Klein, 2005) and history (Kimball, 1984) found similar proportions of liberal/left-leaning and conservative/right-leaning scientists. Future studies should aim to include a larger and more balanced sample of historians with diverse political orientations to ensure that the findings are more broadly applicable and to better understand how political bias may manifest differently among right-leaning historians.

Third, while we investigated the impact of political preferences on historiography, focusing specifically on general political orientation, other studies on political bias in science also employed measures of political orientation with respect to foreign policy or economics (Inbar & Lammers, 2012). Future studies should incorporate further measures to provide a broader understanding of political preferences in historiographical research.

Finally, the external validity of presenting abstracts as a basis for making publication decisions might be lower than in field studies (Findley et al., 2021), as researchers typically have much more information available when deciding whether to submit their research for publication. This should be taken into account in future studies by incorporating more materials, such as referring to full manuscripts or more detailed summaries, to better reflect the information researchers would normally consider in their decision-making process.

Conclusion

This study reveals that political preferences significantly influence historians’ initial judgments of publishability for contemporary history abstracts, with a clear preference for abstracts aligning with their political views. Even when given the chance to reconsider, these influences largely persist, indicating that reflective decision-making may not fully mitigate their impact. The present findings suggest that such preferences are common across different regions and professional levels, underlining the need for greater awareness and consideration of how political perspectives might shape scientific evaluations. We critically discussed whether our results resemble a bias, comparable to publication bias. Overall, this research highlights the importance of recognizing and addressing political preferences in historiography to ensure the rigor and fairness of historical scholarship across the academic landscape.

Ethical approval

This study was approved by the Ethics Committee of the Faculty of Education and Psychology at Freie Universität Berlin. The project received approval on February 12, 2024. The approval number is 001.2024. The Ethics Committee determined that the research adheres to ethical guidelines and raised no ethical objections to the study. Participants were required to provide explicit electronic consent before participating, and the process for obtaining this consent has been described in detail, including its approval by the ethics committee.

Author contributions

  • Conceptualization: Louis Schiekiera, Helen Niemeyer

  • Data Curation: Louis Schiekiera

  • Formal Analysis: Louis Schiekiera

  • Funding Acquisition: Helen Niemeyer

  • Investigation: Louis Schiekiera

  • Methodology: Louis Schiekiera, Helen Niemeyer

  • Project Administration: Helen Niemeyer

  • Resources: Helen Niemeyer

  • Software: Louis Schiekiera

  • Supervision: Helen Niemeyer

  • Validation: Louis Schiekiera

  • Visualization: Louis Schiekiera, Helen Niemeyer

  • Writing – Original Draft Preparation: Louis Schiekiera, Helen Niemeyer

  • Writing – Review & Editing: Louis Schiekiera, Helen Niemeyer

How to cite this article: Schiekiera L and Niemeyer H. Political bias in historiography - an experimental investigation of preferences for publication as a function of political orientation [version 1; peer review: awaiting peer review]. F1000Research 2025, 14:320 (https://doi.org/10.12688/f1000research.160170.1)