The Soft Disinformation Contagion

How we learned to reason poorly with accurate data.

A group of adults and children in white T-shirts look down at smartphones in a dim studio. Pexels.

We live in a paradoxical informational landscape. Never before have citizens had access to such an abundance of data, expert commentary, real-time analysis, and historical context. The digital age promised a “Great Clarification,” where the democratisation of information would serve as a universal solvent for ignorance. But despite this unprecedented availability, it has rarely been so difficult to form a coherent understanding of what is actually happening in the world.

The problem is not ignorance in the classical sense. The contemporary public is not uninformed; on the contrary, it is hyper-informed. People consume an endless stream of charts, “explainer” threads, and long-form analysis. But instead of clarity, the cumulative result is often a paralysing confusion paired with a growing sense of moral certainty. This is not the confusion of someone who knows they lack information; it is the confusion of someone who feels deeply informed and yet cannot explain why reality consistently fails to behave as predicted. Something essential is missing. That missing element is not data; it is discernment.

When disinformation is discussed in public discourse, it is usually framed as a problem of lies—fake news, fabricated claims, or manipulated images spread deliberately to mislead. While that phenomenon exists, it is no longer the primary threat to collective understanding. The most pervasive form of disinformation today is far subtler and more effective because it does not rely on false data. Instead, it relies on flawed reasoning methods applied to real, verifiable information.

“Soft disinformation” can be defined as the systematic transmission of non-falsifiable reasoning patterns using accurate or verifiable information. Its defining feature is not what it says but how it teaches people to think. This distinction matters because there is no such thing as analysis without bias; every interpretation of reality begins with assumptions. The issue is not the presence of bias but whether the analytical method allows those assumptions to be challenged, revised, or falsified. Soft disinformation does not prohibit dissent; it renders dissent structurally irrelevant. The most effective disinformation does not lie; it teaches people how to reason poorly while feeling intellectually responsible.

The operational structure of soft disinformation is remarkably consistent. It begins with an implicit conclusion established in advance. Data compatible with that conclusion is selected, while variables that complicate the narrative are omitted or deprioritised. The result is presented as neutral, evidence-based analysis, often draped in the visual language of authority—charts, citations, and professional jargon. Statistics may be accurate and quotations may be authentic, but the manipulation occurs entirely in the assembly. Crucially, the method does not generate hypotheses that can fail. The conclusion is insulated from refutation by design.

One of the most counterintuitive aspects of soft disinformation is that higher intelligence and education do not necessarily protect against it. In many cases, they increase vulnerability, a phenomenon cognitive scientists describe as motivated reasoning or motivated numeracy. Research in this area suggests that individuals with the strongest mathematical and analytical skills are often the most likely to deploy those skills in defence of a preferred conclusion rather than in objective analysis of the data at hand.

Individuals trained to process complex information often rely on trusted analytical shortcuts. When those shortcuts are supplied by authoritative voices using familiar academic conventions, scrutiny decreases. Moreover, analytical sophistication can be repurposed defensively. Complex reasoning skills make it easier to rationalise a preferred conclusion after the fact, especially when the surrounding informational environment offers ample material for selective justification. High-IQ individuals are not necessarily more objective; they are simply better at constructing elaborate justifications for their existing biases. Soft disinformation does not target the ignorant; it targets the motivated.

Within this ecosystem, the “militant analyst” has become prominent. This is not a propagandist in the crude sense, nor necessarily an incompetent journalist. It is someone who has adapted their analytical behaviour to an environment that rewards certainty and penalises doubt. The militant analyst does not seek to disconfirm their framework; they seek to optimise it rhetorically. Data become ammunition rather than instruments of discovery.

The current media system does not eliminate dissent; it amplifies it when it is spectacular, confrontational, or identity-based. What it punishes most harshly is grey, provisional, and revisable analysis. In such an environment, falsifiability does not disappear through censorship; it disappears through adverse selection. The analysts who thrive are those who can provide the most robust—meaning the least revisable—narratives. This creates a market where intellectual humility is a professional liability and performative certainty is the highest currency.

Soft disinformation spreads not because people are irrational, but because it aligns neatly with how human cognition conserves energy. Thinking is cognitively expensive; certainty is a survival mechanism. Humans are pattern-seeking creatures. When presented with correlated data framed within a compelling story, the mind readily infers causation. This tendency is intensified by the psychological need for narrative closure. Open-ended analysis requires sustained mental effort to sit with ambiguity, whereas closed narratives offer emotional relief by resolving tension.

Furthermore, moral certainty acts as a cognitive shortcut. When a conclusion is framed as ethically obvious, questioning the logic behind it begins to feel like a moral failing. Doubt becomes psychologically and socially costly. In a high-volume environment, methods that deliver emotionally satisfying conclusions with minimal cognitive strain are rewarded, regardless of their epistemic quality. Soft disinformation is persuasive not because it is deceptive, but because it is efficient.

We cannot ignore the role of algorithmic curation in this epidemic. Modern platforms are designed to optimise for engagement, and engagement is most reliably triggered by narratives that provide high levels of “narrative fit.” When an algorithm detects that a user responds to a specific methodological framing, it begins to filter the world to match that framing.

This leads to a state of epistemic closure. The user is never lied to; they are simply shown a version of reality where the “true” data points always align with their pre-existing reasoning patterns. This process removes the friction of contradictory data, producing a profound atrophy of the cognitive muscles required for discernment.
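A deliberately crude sketch of such a ranker (all names, weights, and engagement rates are invented) shows how quickly contradictory material can vanish from a feed that optimises only for engagement:

```python
FRAMES = ["fits", "contradicts"]

def make_items(n=50):
    # Hypothetical inventory: half the items fit the user's framing, half do not.
    return [{"frame": FRAMES[i % 2]} for i in range(n)]

def run_feed(rounds=20, slots=10):
    # The ranker starts neutral and upweights whatever earns engagement.
    weight = {"fits": 1.0, "contradicts": 1.0}
    contradicting_shown = []
    for _ in range(rounds):
        # Rank purely by learned engagement weight, never by epistemic value.
        ranked = sorted(make_items(), key=lambda it: weight[it["frame"]],
                        reverse=True)
        feed = ranked[:slots]
        contradicting_shown.append(
            sum(it["frame"] == "contradicts" for it in feed))
        for it in feed:
            # Assumed engagement rates: 90% for frame-fitting items, 10% otherwise.
            weight[it["frame"]] += 0.9 if it["frame"] == "fits" else 0.1
    return contradicting_shown

history = run_feed()
print("contradicting items per 10-slot feed:", history)
```

The first feed is mixed; after one round of feedback, frame-fitting items outrank everything else and contradicting items are never shown again. Nothing false is ever served, yet the contradiction simply disappears.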

This dynamic is not abstract. It has already unfolded in domains we consider epistemically rigorous. One of the clearest examples is the replication crisis in the social sciences. For decades, accurate data from single, peer-reviewed studies were used to build sweeping cultural narratives. These studies were not fake; the statistical methods were applied correctly at the time, and the participants were real.

However, soft disinformation occurred in the assembly of these findings into public policy and media explainers. Negative results—studies that found no effect—were systematically under-represented through publication bias, often described as the file drawer problem. The public was presented with a curated body of evidence that appeared overwhelming but was, in reality, a selective sample of outliers. Because the narrative was framed as scientific, questioning it was treated as a denial of reality. This is the essence of soft disinformation: using a series of true points to draw a false line while insulating that line from scrutiny.
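The file drawer effect is simple enough to simulate (a sketch with invented parameters, not a model of any real literature): the true effect below is exactly zero and every individual study is honestly conducted, yet the "published" subset, filtered by statistical significance, reports a substantial average effect.

```python
import random
import statistics

random.seed(0)

TRUE_EFFECT = 0.0              # the intervention genuinely does nothing
N_PER_STUDY = 30
SE = 1 / N_PER_STUDY ** 0.5    # standard error of the mean, unit-variance data

def run_study():
    # An honest study: real participants, correct statistics, true effect zero.
    sample = [random.gauss(TRUE_EFFECT, 1) for _ in range(N_PER_STUDY)]
    return statistics.fmean(sample)

effects = [run_study() for _ in range(2000)]

# Publication filter: only significant positive results leave the file drawer.
published = [e for e in effects if e > 1.96 * SE]

print(f"mean effect, all studies:       {statistics.fmean(effects):+.3f}")
print(f"mean effect, published studies: {statistics.fmean(published):+.3f}")
```

No single published study is fake; the distortion is produced entirely by which true results are allowed to accumulate.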

These incentives are structural. Speed, volume, and engagement are the primary metrics of success, and nuanced analysis that requires contextual buildup and explicit uncertainty is slow and performs poorly in attention markets. Predictability builds loyal audiences, and moral alignment outperforms epistemic humility. The system does not demand distortion; it merely rewards certain behaviours consistently enough that others disappear.

The result is a steady drift toward methods that feel analytical but are structurally closed. This degradation produces measurable effects. Public debates become brittle, and policy discussions are evaluated through moral alignment rather than outcome prediction. Analytical errors persist because they are never exposed by failure signals. When reality contradicts expectations, explanations are adjusted post hoc rather than frameworks being revised. The system becomes resistant to learning.

The most serious consequence of this process is the erosion of basic analytical competence at the population level. When information is consistently consumed as a closed narrative, several skills atrophy: the ability to distinguish data from interpretation, the capacity to identify causal versus rhetorical relationships, and the willingness to revise beliefs in response to new evidence.

Society does not become more ignorant; it becomes less capable of autonomous reasoning. People remain informed, but they lose the ability to tell whether a line of reasoning is well constructed. This brittleness accumulates quietly until institutional decision-making becomes systematically detached from feedback.

Perhaps the central problem of our informational age is not that we are misinformed but that we have normalised a way of explaining the world that sounds convincing while teaching little. The question is not whether we consume more analysis than previous generations—we clearly do. The question is whether we understand reality better or simply feel more certain about what we already believed.

The deepest damage is not believing something false. Human history is a catalogue of corrected errors. The deeper damage is losing the ability to tell when reasoning itself has gone wrong. Discernment is not an innate trait; it is a practised skill. It requires exposure to uncertainty and engagement with analytical failure. If discernment is to be restored, it will not come from fact-checking alone. It will require a renewed emphasis on methods that expose themselves to the possibility of being wrong. Until then, we will continue to live in an age rich in information and poor in understanding. That imbalance, more than any individual falsehood, is the silent contagion shaping our collective perception of reality.

Quillette invites thoughtful responses to its essays.
Selected responses are published once per week as part of a curated Letters to the Editor feature. If selected, letters appear under the contributor’s real name and may be edited for clarity and length.

To submit a letter for consideration, please email [email protected].