PHIL 452: Conspiracy Theories
Jennifer Saul
Sources and References
- Keeley, Brian L. “Of Conspiracy Theories.” Journal of Philosophy 96, no. 3 (1999): 109–126.
- Clarke, Steve. “Conspiracy Theories and Conspiracy Theorizing.” Philosophy of the Social Sciences 32, no. 2 (2002): 131–150.
- Coady, David. “Conspiracy Theories and Official Stories.” International Journal of Applied Philosophy 17, no. 2 (2003): 197–209.
- Levy, Neil. “Radically Socialized Knowledge and Conspiracy Theories.” Episteme 4, no. 2 (2007): 181–192.
- Coady, David. “Are Conspiracy Theorists Irrational?” Episteme 4, no. 2 (2007): 193–204.
- Sunstein, Cass R., and Adrian Vermeule. “Conspiracy Theories: Causes and Cures.” Journal of Political Philosophy 17, no. 2 (2009): 202–227.
- Stokes, Patrick. “Conspiracy Theory and the Perils of Pure Particularism.” In Taking Conspiracy Theories Seriously, edited by M. R. X. Dentith, 25–37. Rowman and Littlefield, 2018.
- Cassam, Quassim. Conspiracy Theories. Cambridge: Polity Press, 2019. Chapter 1: “The Real Point of Conspiracy Theories,” 1–31.
- Dentith, M. R. X. “Conspiracy Theories on the Basis of the Evidence.” Synthese 196 (2019): 2243–2261.
- Nguyen, C. Thi. “Echo Chambers and Epistemic Bubbles.” Episteme 17, no. 2 (2020): 141–161.
- Napolitano, M. Giulia. “Conspiracy Theories and Evidential Self-Insulation.” In The Epistemology of Fake News, edited by Sven Bernecker, Amy K. Flowerree, and Thomas Grundmann, 82–105. Oxford University Press, 2021.
- Nguyen, C. Thi. “The Seductions of Clarity.” Royal Institute of Philosophy Supplements 89 (2021): 227–255.
- Duetz, J. C. M. “Conspiracy Theories Are Not Beliefs.” Erkenntnis (2022): 1–15.
- Shields, Matthew. “Rethinking Conspiracy Theories.” Synthese 200, no. 4 (2022): 331.
- Buzzell, Andrew, and Regina Rini. “Doing Your Own Research and Other Impossible Acts of Epistemic Superheroism.” Philosophical Psychology 36, no. 5 (2023): 906–930.
- Levy, Neil. “Conspiracy Theories as Serious Play.” Philosophical Topics 50, no. 2 (2023): 1–19.
Chapter 1: What Is a Conspiracy Theory?
The question of how to define “conspiracy theory” is not merely semantic housekeeping. The label carries enormous rhetorical weight: calling something a conspiracy theory can function as an epistemic dismissal, a way of signalling that an explanation need not be taken seriously. Before we can ask whether conspiracy theories are rational or irrational, harmful or harmless, we need a clear account of what we are talking about. Two foundational papers—Keeley (1999) and Clarke (2002)—offer complementary frameworks that have shaped virtually all subsequent philosophical debate.
1.1 Keeley’s Foundational Account
Brian Keeley’s 1999 article “Of Conspiracy Theories” is widely regarded as the paper that launched the contemporary philosophical study of conspiracy theories. Keeley draws an analogy with Hume’s treatment of miracles to frame the problem: whereas Hume argues that miracles are, by definition, the kind of explanation we are never warranted in believing, Keeley’s first major conclusion is that no such analytic argument is available for conspiracy theories. There is nothing in the concept of a conspiracy theory that makes it inherently irrational.
1.1.1 The Minimal Definition
Keeley defines a conspiracy theory minimally, as a proposed explanation of a historical event in terms of the significant causal agency of a relatively small group of persons acting in secret. This definition has several important features. First, conspiracy theories are explanations—they are attempts to account for events in the world. Second, the agents involved are a relatively small group, not an entire society. Third, secrecy is essential: conspirators operate covertly, concealing their actions from the public and, typically, from official institutions.
1.1.2 Warranted vs. Unwarranted Conspiracy Theories
Keeley recognises that some conspiracy theories are true. The Watergate scandal, the Iran-Contra affair, and various instances of corporate malfeasance all involved genuine conspiracies. This means we cannot dismiss conspiracy theories as a class. The real question is how to distinguish warranted conspiracy theories from unwarranted ones.
While there is no a priori method for drawing this line, Keeley identifies a cluster of characteristics that tend to mark unwarranted conspiracy theories:
- They posit an ever-expanding circle of conspirators.
- They treat counter-evidence as evidence planted by the conspirators themselves.
- They require the conspirators to have near-superhuman competence and coordination.
- They tend to be unfalsifiable in practice, since any disconfirming evidence can be absorbed into the theory.
1.1.3 Monological vs. Mundane Conspiracies
A further important distinction in Keeley’s framework is between monological conspiracy theories and mundane conspiracy theories. Mundane conspiracy theories are limited in scope: they posit a specific group acting secretly to bring about a particular event. They are, in principle, fully falsifiable—investigators can check the relevant records, interview witnesses, and test the hypothesis against available evidence.
Monological conspiracy theories, by contrast, have an all-encompassing character. They aspire to explain not just a single event but a whole pattern of events, and they do so by reference to a master conspiracy operating across many domains over a long period. Because a monological theory ties so many phenomena together under a single explanatory framework, it becomes extremely resistant to falsification: any disconfirming event can be explained by positing that the conspirators arranged it to mislead investigators.
This distinction is epistemically significant. The move from mundane to monological conspiracy theorising is often gradual: an initially limited hypothesis expands to incorporate more and more events, pulling in more conspirators and developing increasingly elaborate auxiliary hypotheses. Recognising this trajectory is one of Keeley’s most practically useful contributions.
1.1.4 The Problem of Errant Data
A key concept in Keeley’s analysis is errant data—data that conflicts with the accepted or official explanation of an event. Conspiracy theories typically begin with errant data: anomalies, coincidences, or unexplained facts that do not fit the received account. Keeley notes that errant data is a legitimate starting point for inquiry. The problem arises when a conspiracy theory explains away all possible counter-evidence, effectively becoming immune to empirical refutation.
1.2 Clarke’s Revisionary Proposal
Steve Clarke (2002) approaches the definition and evaluation of conspiracy theories through the lens of Imre Lakatos’s philosophy of science, specifically the distinction between progressive research programmes and degenerating research programmes. But Clarke also advances a revisionary claim about the very definition of conspiracy theories that distinguishes his account from Keeley’s.
1.2.1 Clarke’s Definitional Move
Where Keeley defines conspiracy theories minimally—as explanations appealing to secret group action—Clarke argues for a more substantive characterisation. On Clarke’s view, genuine conspiracy theories are those that posit unjustified agency as the explanatory key. That is, what marks a conspiracy theory is not merely that it appeals to intentional action, but that it appeals to intentional action where the evidence does not warrant doing so. On this revisionary reading, a conspiracy theory is, by definition, an explanation that attributes causality to conspiratorial agents in circumstances where non-agentive or mundane explanations are better supported by the evidence.
This move has both advantages and disadvantages. Its advantage is that it immediately captures what is epistemically problematic about paradigm conspiracy theories: they attribute intentional agency unnecessarily. Its disadvantage—pressed by defenders of the minimal definition, and later by Duetz—is that it makes “conspiracy theory” a pejorative by definition, thereby excluding in advance the possibility that some conspiracy theories are warranted.
1.2.2 Conspiracy Theories as Research Programmes
In Lakatos’s framework, a research programme is progressive if it generates successful novel predictions and retrodictions—if it leads investigators to discover new facts. A programme is degenerating when it can only accommodate new evidence by adding ad hoc auxiliary hypotheses, without making successful predictions.
Clarke argues that most conspiracy theories function as degenerating research programmes. When new evidence appears to disconfirm a conspiracy theory, adherents modify auxiliary hypotheses and initial conditions to protect the core thesis. Rather than predicting new facts, the theory merely absorbs whatever evidence appears. Clarke contrasts this with the Watergate investigation, which functioned as a progressive research programme: the journalists’ hypothesis that a conspiracy existed led them to make successful predictions about where to find further evidence.
1.2.3 The Fundamental Attribution Error
Clarke introduces a psychological mechanism to explain why people persist in holding conspiracy theories even when they degenerate. The fundamental attribution error is the well-documented tendency in social psychology to overattribute events to the intentional actions of agents rather than to situational, structural, or accidental factors. Conspiracy theories, Clarke suggests, are a natural expression of this cognitive bias: when something bad happens, we are disposed to look for someone who made it happen, rather than accepting that complex events can arise from the interaction of impersonal forces.
1.3 The Pejorative vs. Neutral Use of “Conspiracy Theory”
A recurring tension in the literature concerns whether “conspiracy theory” should be treated as a neutral descriptive term or as a term with inherent negative connotations. Understanding this tension matters because it shapes every subsequent epistemological and political question.
1.3.1 The Pejorative Use
In ordinary language and much public discourse, “conspiracy theory” functions as a pejorative—a term that, when applied to a claim, signals that the claim is not worthy of serious consideration. When a politician labels a journalist’s report a “conspiracy theory,” or when a corporation dismisses a safety concern as “conspiracy thinking,” the label is doing rhetorical work: it marks the claim as belonging to a category of utterances that rational people dismiss without further investigation.
This pejorative use has a history. Scholars have traced the rhetorical deployment of “conspiracy theory” as a silencing label at least back to the 1960s, when the CIA reportedly promoted the term as a way to discredit critics of the Warren Commission’s account of the Kennedy assassination. Whether or not this specific origin story is accurate, it illustrates that the label “conspiracy theory” is not epistemically neutral—it is embedded in political contexts and power relations.
1.3.2 The Neutral Use
Philosophers working in the minimalist tradition (Keeley, Coady, Duetz) insist on a neutral use of “conspiracy theory”—one that simply picks out a certain type of explanation without prejudging its epistemic status. On this view, any explanation that posits a small group acting secretly is a conspiracy theory, and whether that theory is warranted is a separate question to be settled by examining the evidence.
The neutral use has the advantage of avoiding question-begging: it allows us to acknowledge that some conspiracy theories (Watergate, COINTELPRO, MKUltra) are true and warranted, without having to tie ourselves into definitional knots. Its disadvantage is that it may obscure the genuine epistemic pathologies associated with the social practice of conspiracy theorising—pathologies that motivate the more evaluatively loaded definitions.
1.3.3 Why the Definition Matters for Epistemology
The choice of definition has direct downstream consequences for epistemology. If “conspiracy theory” is a neutral term, then the question “should we believe conspiracy theories?” is as unanswerable as “should we believe explanations?”—the answer is always “it depends on the evidence.” The interesting epistemological questions then concern what evidence bears on conspiracy theories and how we should evaluate it.
If, on the other hand, “conspiracy theory” is a term with built-in negative epistemic connotations, then the question “should we believe conspiracy theories?” has a more tractable answer: roughly, no, unless there are unusually compelling reasons to do so. The epistemological task then becomes identifying the special features of conspiracy theories that warrant this default scepticism.
1.4 A Taxonomy of Definitions
The following table summarises how four key theorists conceptualise “conspiracy theory.” Note that the differences between them are not merely definitional—they reflect deeper disagreements about what is epistemically and politically significant about conspiracy theorising.
| Theorist | Core Definition | Key Feature | Normative Status |
|---|---|---|---|
| Keeley (1999) | Explanation appealing to secret group action | Errant data; expandable conspirator network | Neutral; evaluate case by case |
| Clarke (2002) | Explanation positing unjustified agency | Degenerating research programme structure | Negative by definition |
| Coady (2003) | Explanation conflicting with official story | Relational; same content can shift status | Neutral; official stories equally suspect |
| Dentith (2019) | Explanation that posits a conspiracy | Particularist; evidence-based evaluation | Neutral; strong particularism |
1.5 Tensions in Defining Conspiracy Theories
The definitions offered by Keeley and Clarke raise a tension that runs through the entire philosophical literature. If a conspiracy theory is simply an explanation that appeals to a small group acting in secret (Keeley’s minimal definition), then many well-confirmed explanations—including official explanations of terrorist attacks—count as conspiracy theories. The term, on this reading, is epistemically neutral. But if we build evaluative criteria into the definition (as Clarke does by invoking degenerating research programmes), we risk begging the question against conspiracy theories from the outset.
This tension between minimalist definitions and loaded definitions recurs in nearly every subsequent debate. Minimalists argue that we should define the term neutrally and then ask, as a separate question, whether particular conspiracy theories are rational. Advocates of loaded definitions argue that the neutral definition fails to capture what is distinctive and epistemically worrying about conspiracy theorising as a social practice.
Chapter 2: Conspiracy Theories and Official Stories
David Coady’s 2003 article “Conspiracy Theories and Official Stories” shifts the focus from the internal features of conspiracy theories to their relationship with the explanations endorsed by authoritative institutions. This reframing has profound consequences for how we evaluate conspiracy theorising.
2.1 Coady’s Relational Definition
On Coady’s account, what makes something a conspiracy theory is not its content (whether it posits a conspiracy) but its social position (whether it opposes the official account). This is a relational property: the very same explanation could be a conspiracy theory at one time and the official story at another, depending on which explanation the authorities endorse.
2.1.1 Coady’s Core Claim
Coady’s central positive thesis is that conspiracy theories, on his relational definition, should be understood as explanations that simply posit powerful agents acting secretly to bring about outcomes they publicly deny. This formulation does three things. First, it highlights that conspiracies involve power—not just any group, but agents with resources, institutional access, and the capacity to cover their tracks. Second, it highlights secrecy—the conspirators are not merely acting privately but are actively suppressing public knowledge of what they do. Third, it highlights denial—the conspirators publicly repudiate the very actions they are performing, adding a layer of deception to the covert action itself.
Framed this way, Coady makes vivid that conspiracy theories are not exotic departures from normal political reality. Powerful agents routinely act secretly and publicly deny what they are doing. The history of government intelligence agencies, corporate cover-ups, and political machinations provides abundant confirmation of this pattern.
2.1.2 Warranted Conspiracy Beliefs: Historical Cases
Coady is insistent that the reflexive dismissal of conspiracy theories ignores a substantial historical record of real conspiracies that were, for a time, dismissed as the paranoid fantasies of conspiracy theorists. Three cases are particularly instructive:
Watergate: The Nixon administration’s operation to spy on and sabotage political opponents, and the subsequent cover-up, was initially denied by the White House and dismissed by many commentators as excessive suspicion. The conspiracy was real, and those who believed it were vindicated. The lesson: institutional authority is no guarantee of truth.
COINTELPRO: The FBI’s Counterintelligence Program ran from 1956 to 1971 and involved systematic surveillance, infiltration, harassment, and disruption of political organisations, including the civil rights movement and anti-war groups. Participants in these movements who claimed they were being surveilled and subverted were dismissed as paranoid. They were not.
MKUltra: The CIA’s mind-control research program, which included covert experiments on unwitting subjects, was vigorously denied for years. Congressional investigations in the 1970s confirmed its existence. Those who suspected the program were conspiracy theorists in the only meaningful sense—they believed a conspiracy was occurring—and they were correct.
2.1.3 Official Stories Can Themselves Be Conspiracy Theories
Coady points out a crucial irony: official stories often are conspiracy theories in the content sense. The official explanation of the 9/11 attacks, for example, posits a conspiracy by al-Qaeda. But no one calls it a “conspiracy theory” because it is the account endorsed by the relevant authorities. The label “conspiracy theory” is reserved for explanations that challenge official accounts. This reveals that the term functions less as a description of content than as a marker of social and epistemic marginality.
2.2 The Symmetry Argument
A central and distinctive contribution of Coady’s work is what we might call the symmetry argument: the claim that there is no principled epistemic asymmetry between conspiracy theories and official stories that would justify treating official stories as more credible by default.
2.2.1 Why the Asymmetry Is Assumed
The standard assumption is something like: official stories emerge from accountable institutions with professional obligations to accuracy, and are therefore more credible than unofficial explanations produced by individuals without comparable accountability. This assumption licenses a kind of epistemic privilege for official stories—a presumption in their favour that places the burden of proof on those who challenge them.
2.2.2 Why the Asymmetry Fails
Coady challenges this assumption on several grounds:
Track record: Official sources have their own history of deception, error, and motivated reasoning. Intelligence agencies have fabricated evidence; governments have suppressed inconvenient findings; official commissions have been subject to political interference. The track record of official sources is not clean enough to warrant a strong prior in their favour.
Incentive structures: Official sources often have strong incentives to mislead—to protect political leaders, avoid legal liability, maintain public confidence, or pursue geopolitical objectives. These incentive structures are not obviously better aligned with truth-telling than those of unofficial critics.
The relational point: Since official stories are themselves sometimes conspiracy theories in content (they posit groups acting secretly), the epistemic privilege attached to official stories does not track a difference in kind between conspiracy theories and official stories, but merely a difference in who is doing the claiming.
2.2.3 Institutional Trust and Epistemic Privilege
Despite the force of the symmetry argument, it would be a mistake to conclude that official sources deserve no greater credibility than random claims on the internet. The philosophical question is not whether institutional trust is rational—it often is—but what grounds that trust and under what conditions it should be calibrated downward.
Coady’s more nuanced position is that epistemic trust in institutions should be calibrated to their demonstrated reliability in the relevant domain. Public health institutions with strong track records of accurate reporting deserve substantial trust on public health questions; intelligence agencies with documented histories of manipulation deserve less trust on politically sensitive matters. The key is calibration, not blanket trust or blanket suspicion.
2.3 The Problem of When the Official Story Is the Conspiracy Theory
The relational character of Coady’s definition generates an interesting limiting case. If official stories are defined as what the relevant authorities endorse, and conspiracy theories are defined as what conflicts with official stories, then a conspiracy theory can become an official story and vice versa simply by a change in which groups hold power.
This is not merely hypothetical. In the Watergate case, the “conspiracy theory” that Nixon’s team had broken into the Democratic National Committee’s headquarters and then covered it up was, for a period, unofficial. Once the cover-up unravelled and the congressional investigation confirmed the conspiracy, this account became the official story—while the White House’s denials became the “conspiracy theory.” The epistemic content of the two accounts did not change; only their institutional status did.
This observation reinforces Coady’s central lesson: the label “conspiracy theory” tracks social and political position, not epistemic merit. An explanation’s social status tells us nothing, by itself, about whether we should believe it.
Chapter 3: Can Conspiracy Theorising Be Rational?
The question of whether conspiracy theorising can be rational lies at the heart of the philosophical debate. Two main positions have emerged: generalism, which holds that conspiracy theories are epistemically defective as a class, and particularism, which holds that conspiracy theories should be evaluated case by case on their merits. The exchange between Neil Levy (2007), David Coady (2007), and M. R. X. Dentith (2019) maps the key contours of this dispute.
3.1 Levy’s Argument from Radically Socialized Knowledge
Neil Levy’s “Radically Socialized Knowledge and Conspiracy Theories” (2007) offers a sophisticated argument for generalism rooted in social epistemology.
3.1.1 Knowledge as Constitutively Social
Levy’s central premise is that knowledge is constitutively social. This means not merely that we happen to learn from others, but that many forms of knowledge are only accessible through an agent’s embedding in a community of epistemic agents. Advances in knowledge typically require ongoing immersion in a social epistemic environment: peer review, institutional credentialing, shared norms of evidence, and collaborative inquiry.
3.1.2 The Epistemic Cost of Conspiracy Theorising
Levy defines conspiracy theories as explanations that conflict with the accounts advanced by the relevant epistemic authorities. He then argues that embracing such a theory carries a distinctive epistemic cost: the conspiracy theorist must, to some degree, cut herself off from the very social epistemic environment that makes reliable inquiry possible. By rejecting the consensus of relevant experts, the conspiracy theorist forgoes the epistemic resources that would be needed to properly evaluate the conspiracy theory itself.
3.1.3 Propositional vs. Doxastic Rationality
Levy’s argument becomes more precise when we distinguish between two different senses of rationality. Propositional rationality concerns whether the evidence available supports a given belief: is there sufficient evidence to make it rational to hold this belief? Doxastic rationality concerns whether the belief was actually formed through a good process: was the belief formed in a way that reliably tracks truth?
A belief can be propositionally rational while being doxastically irrational. An agent who, by lucky accident, ends up holding a conspiracy theory that happens to be well-supported by the evidence—but who arrived at that belief through paranoid pattern-matching rather than careful inquiry—has a propositionally rational but doxastically irrational belief. Levy’s argument targets doxastic rationality: the process of conspiracy theorising, characterised by rejection of expert consensus and reliance on marginal sources, is a poor truth-tracking process even when it occasionally produces true beliefs.
3.1.4 Implications
Levy concludes that it is almost never rational for either laypersons or intellectuals to accept conspiracy theories. For laypersons, the case is straightforward: they lack the expertise to evaluate the relevant evidence independently. For intellectuals, the situation is more subtle but the conclusion is the same: by rejecting the relevant epistemic community, they place at risk the very capacities that might enable them to evaluate the theory correctly.
3.2 Coady’s Defence of Conspiracy Theorists
In “Are Conspiracy Theorists Irrational?” (2007), Coady responds directly to Levy, mounting a defence of conspiracy theorists’ rationality.
3.2.1 The Problem with Defining Irrationality
Coady argues that on the best available definitions of what it is to be a conspiracy theorist, conspiracy theorists do not deserve their reputation for irrationality. If a conspiracy theorist is simply someone who believes in explanations that conflict with official stories (Coady’s own 2003 definition), then many conspiracy theorists are clearly rational: history is full of cases where the official story was false and the “conspiracy theory” was true.
3.2.2 Challenging the Authority of Authorities
Coady challenges Levy’s reliance on epistemic authorities. He argues that Levy does not adequately distinguish between different kinds of authorities. Some authorities—such as government spokespeople—have well-documented track records of deception. Others—such as scientific communities—are more reliable, but even they are not infallible. The rational response is not to defer uncritically to all authorities, but to calibrate one’s trust according to the track record and incentive structure of the particular authority in question.
3.2.3 The Spectrum of Irrationality
Even if we adopt a definition that makes conspiracy theorists irrational by definition (for example, by defining them as people who believe conspiracy theories in the face of overwhelming evidence), Coady argues that this irrationality should be understood as one end of a spectrum. At the other end is the equally irrational disposition to believe official stories in the face of overwhelming evidence to the contrary. The more worrying form of irrationality, Coady suggests, may be at the naivety end of the spectrum, since naive trust in authorities can have more dangerous consequences than excessive suspicion.
3.3 Dentith’s Evidential Particularism
M. R. X. Dentith’s “Conspiracy Theories on the Basis of the Evidence” (2019) advances the particularist position by examining the evidential basis of conspiracy theories.
3.3.1 Against Prima Facie Scepticism
Dentith argues that there is no prima facie case for scepticism about conspiracy theories based on their use of evidence. The evidential problems typically associated with conspiracy theories—reliance on circumstantial evidence, appeals to anomalies, difficulty of verification—are not unique to conspiracy theories. They arise in many perfectly respectable forms of inquiry, including criminal investigations, historical research, and even natural science.
3.3.2 The Bayesian Question: What Prior Should We Assign?
One way to frame the generalism-particularism debate is in Bayesian terms. A generalist might argue that we should assign conspiracy theories a low prior probability—before examining the specific evidence—on the grounds that conspiracies are difficult to sustain, require many actors to remain silent, and are often exposed. A particularist would resist setting any uniform prior, arguing that the base rate for conspiracies is not low enough to warrant systematic scepticism, especially when we are talking about powerful actors with strong incentives and resources to maintain secrecy.
The historical record provides some guidance. Coady’s cases (Watergate, COINTELPRO, MKUltra) suggest that large-scale, long-running conspiracies by government agencies do occur. This is Bayesian evidence that the prior probability of such conspiracies is not negligible. Dentith’s particularism can be read as the position that our prior should be set by reference to this base rate, rather than by an intuition about conspiracies being inherently implausible.
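The stakes of the prior-setting dispute can be made concrete with a toy Bayes-rule calculation. All the numbers below are invented for illustration—they are not estimates from the literature—but they show how the same evidence moves a generalist's low prior and a particularist's base-rate-informed prior to very different posteriors.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H)·P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Hypothetical evidence E that is three times likelier if a
# conspiracy H occurred than if it did not (invented numbers).
likelihood_h, likelihood_not_h = 0.6, 0.2

# A generalist's very low prior (1%) keeps the posterior low,
# while a particularist prior set by a historical base rate (20%)
# lets the same evidence raise the posterior substantially.
generalist = posterior(0.01, likelihood_h, likelihood_not_h)
particularist = posterior(0.20, likelihood_h, likelihood_not_h)

print(f"generalist posterior:    {generalist:.3f}")   # roughly 0.029
print(f"particularist posterior: {particularist:.3f}")  # roughly 0.429
```

The same three-to-one likelihood ratio moves a 1% prior to about 3% but a 20% prior to about 43%. That is the particularist's structural point: the verdict turns on the base rate one assumes for conspiracies of the relevant kind, not on the label "conspiracy theory" itself.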
3.3.3 The Particularist Conclusion
Dentith’s central conclusion is that each conspiracy theory should be evaluated on its own evidential merits. If a particular conspiracy theory is supported by strong evidence and generates successful predictions, it should be accepted regardless of the “conspiracy theory” label. If it is supported by weak evidence and relies on ad hoc modifications, it should be rejected—but so should any other theory with those features. The fact that an explanation posits a conspiracy is, by itself, epistemically irrelevant.
Chapter 4: Evidential Self-Insulation and Unfalsifiability
Even if particularism is correct in principle—even if there is nothing inherently irrational about positing a conspiracy—there remain distinctive epistemic pathologies associated with conspiracy theorising as a practice. Two papers, Napolitano (2021) and Duetz (2022), debate what these pathologies are and how they should be conceptualised.
4.1 Napolitano on Evidential Self-Insulation
M. Giulia Napolitano’s “Conspiracy Theories and Evidential Self-Insulation” (2021) proposes a novel account of what makes conspiracy theories epistemically distinctive.
4.1.1 The Self-Insulation Thesis
Napolitano argues that conspiracy theories are not best understood as a type of explanation (a theory about a conspiracy) but as a type of belief: specifically, a self-insulating belief in a conspiracy. On this view, what is distinctive about conspiracy theories is not their content—many true and well-confirmed explanations posit conspiracies—but the manner in which they are held.
4.1.2 The Epistemically Closed System
Napolitano’s analysis connects to the broader epistemological concept of an epistemically closed system. An epistemically closed system is a framework of beliefs that is structured so as to make external challenge impossible in principle. The system assigns meaning to evidence only within its own interpretive framework, and that framework is itself immune to revision because any challenge to the framework is interpreted as further evidence for it.
Paradigm conspiracy theories exhibit this structure. Consider a theory according to which a shadowy elite controls the media, governments, and scientific institutions. Any mainstream report debunking the theory is ipso facto evidence that the media is controlled; any government denial is ipso facto evidence that the government is in on it; any scientific consensus is ipso facto evidence that scientists have been corrupted or silenced. The system is closed: the only evidence that could, even in principle, enter without being reinterpreted is evidence that confirms the theory.
4.1.3 Why Self-Insulation Is Irrational
Napolitano acknowledges that conspiracy theorists have a prima facie justification for some resistance to counter-evidence. If a conspiracy is genuinely occurring, the conspirators would presumably attempt to cover it up, planting misleading evidence and discrediting those who expose the conspiracy. Some degree of scepticism toward evidence that “conveniently” supports the official story is therefore rational.
However, Napolitano argues that the resistance to evidence typical of conspiracy theorising goes far beyond what this consideration can justify. Conspiracy theorists do not merely discount some counter-evidence; they treat all counter-evidence as further confirmation of the conspiracy. This blanket immunity to disconfirmation is epistemically indefensible, regardless of whether a genuine conspiracy is occurring.
4.1.4 The Seductiveness of Irrefutability
There is a psychological dimension to self-insulation that Napolitano’s analysis illuminates: irrefutability is not merely an epistemic vice—it is actively appealing. A belief that cannot be refuted provides a form of cognitive security unavailable to beliefs that remain open to challenge. Once a person has adopted a self-insulating conspiracy theory, they need not fear being proved wrong; every apparent counter-example can be neutralised. This security is epistemically purchased at enormous cost—the cost of genuine contact with reality—but psychologically it can feel like strength rather than weakness.
This observation connects to Nguyen’s later analysis of the seductions of clarity (Chapter 7): both clarity and irrefutability offer a kind of cognitive comfort that genuine inquiry, with its irreducible uncertainty, cannot provide.
4.1.5 Conspiracy Theories Are Always Irrational
Napolitano draws a strong conclusion: conspiracy theories, understood as evidentially self-insulated beliefs, are always irrational. This does not mean that conspiracies never happen, or that it is irrational to believe that a conspiracy has occurred. It means that holding a belief about a conspiracy in the self-insulating manner characteristic of conspiracy theorists is always an epistemic failing.
4.2 Falsifiability, Popper, and the Philosophy of Science Connection
The problem of evidential self-insulation connects naturally to Karl Popper’s falsifiability criterion, one of the most influential ideas in twentieth-century philosophy of science. Popper argued that what distinguishes genuine scientific hypotheses from pseudoscience is not that they are confirmed by evidence but that they are, in principle, capable of being refuted by evidence. An unfalsifiable claim—one that is compatible with any possible evidence—makes no genuine empirical claim about the world.
Conspiracy theories that exhibit the self-insulation Napolitano describes fail the Popperian test spectacularly. If no possible evidence could disconfirm the theory—if every piece of apparent counter-evidence is reinterpreted as further confirmation—then the theory is not an empirical hypothesis at all but a framework for interpreting the world that is immune to empirical challenge. On Popperian grounds, such theories are pseudoscientific.
There is, however, an important complication. Popper’s criterion, taken literally, would exclude many established scientific theories: the core of any theory can be protected from falsification by adjusting auxiliary hypotheses, as Duhem and Quine noted. The question is whether a theory is being systematically shielded from falsification by adding ad hoc auxiliary hypotheses, or whether adjustments are independently motivated. This is precisely Keeley’s and Clarke’s concern about conspiracy theories: they characteristically resort to the former.
4.3 Buzzell and Rini on Misinformation Ecology and Self-Insulation
Andrew Buzzell and Regina Rini’s work (2023) adds an important structural dimension to the self-insulation problem. Where Napolitano focuses on the cognitive dispositions of individual conspiracy theorists, Buzzell and Rini argue that self-insulation is reinforced and stabilised by the misinformation ecology of contemporary digital environments.
4.3.1 Structural Reinforcement of Self-Insulation
The misinformation ecology consists of algorithmically curated information environments that direct users toward content that confirms their existing beliefs, social networks that reward the performance of group identity, and content creators who have economic incentives to produce increasingly extreme confirmatory content. Within such an ecology, evidential self-insulation is not just a feature of individual psychology but is structurally reinforced at every point. Corrective information is algorithmically deprioritised; confirmatory information is amplified; dissenting voices are excluded or drowned out.
This structural dimension matters because it changes the locus of epistemic responsibility. If self-insulation were purely a feature of individual cognitive dispositions, the remedy would lie in individual epistemic education. But if it is structurally reinforced by the information environment, individual-level interventions are unlikely to be sufficient—the ecology itself must be addressed.
4.3.2 Anomaly Hunting
Buzzell and Rini also illuminate the anomaly hunting pattern characteristic of conspiracy theorising: the practice of treating any anomaly, inconsistency, or unexplained detail in the official story as confirmation of the conspiracy. Anomaly hunting exploits a genuine feature of complex events—all complex events have anomalies, loose ends, and unexplained details—to create the appearance that the official story cannot be true.
The anomaly hunting pattern is epistemically perverse because it applies an asymmetric evidential standard. The conspiracy theorist demands perfection from the official story—any unexplained detail is treated as a fatal objection—while applying no comparable standard to the conspiracy theory, which typically contains far more unexplained details and is never tested against the evidence with the same rigour.
4.4 Duetz’s Critique: Conspiracy Theories Are Not Beliefs
J. C. M. Duetz’s “Conspiracy Theories Are Not Beliefs” (2022) mounts a direct challenge to Napolitano’s account, arguing that it conflates conspiracy theories with conspiracy beliefs.
4.4.1 The Conflation Problem
Duetz argues that Napolitano’s account fails to distinguish between two separate phenomena:
- A conspiracy theory: an explanation or hypothesis about an event that posits a conspiracy.
- A conspiracy belief: the psychological state of believing such an explanation, which may or may not involve evidential self-insulation.
Napolitano defines conspiracy theories as a type of belief (self-insulating beliefs in conspiracies). Duetz argues that this definition collapses the theory into the belief, making it impossible to say that the same conspiracy theory could be held rationally by one person and irrationally by another. But this is precisely what we should be able to say: the theory itself is a proposition or explanation that can be evaluated on its merits; the manner in which an individual holds that theory is a separate psychological question.
4.4.2 Defending the Minimalist Account
Duetz defends the Minimalist Account of conspiracy theories—roughly, Keeley’s original definition according to which a conspiracy theory is simply an explanation that appeals to a conspiracy. Duetz argues that this neutral definition is more theoretically fruitful than Napolitano’s loaded definition, because it allows us to ask all the important questions (When is belief in a conspiracy theory rational? What makes some conspiracy beliefs resistant to evidence?) without building the answers into the definition.
4.4.3 Implications for Research
The distinction between theories and beliefs has important implications for empirical research on conspiracy theorising. If we conflate the two, we risk treating all believers in conspiracy theories as epistemically pathological, even when their beliefs are well-supported. The minimalist account keeps the descriptive and evaluative questions separate, allowing for more nuanced inquiry.
Chapter 5: “Doing Your Own Research”: Epistemic Autonomy
A recurring theme in conspiracy theory discourse is the injunction to “do your own research.” This chapter examines the philosophical problems with this ideal of epistemic self-reliance, drawing on Buzzell and Rini (2023), Stokes (2018), Nguyen (2020, 2021), and Cassam (2019).
5.1 Buzzell and Rini on Epistemic Superheroism
Andrew Buzzell and Regina Rini’s “Doing Your Own Research and Other Impossible Acts of Epistemic Superheroism” (2023) analyses the epistemic ideal that drove much of the misinformation during the COVID-19 pandemic.
5.1.1 The Ideal of Epistemic Autonomy
The injunction to “do your own research” (DYOR) rests on an ideal of epistemic autonomy—the idea that rational agents should form their beliefs independently, by personally examining the relevant evidence, rather than relying on the testimony of others.
Buzzell and Rini argue that this ideal, while superficially attractive, is deeply unrealistic. Modern knowledge is radically distributed: no individual can possess more than a tiny fraction of the knowledge needed to evaluate complex empirical claims about epidemiology, vaccine safety, climate science, or geopolitics. We inevitably depend on others—experts, institutions, sociotechnical systems—to mediate evidence and testimony.
5.1.2 The Division of Cognitive Labour
A foundational insight of social epistemology is that sophisticated knowledge production requires a division of cognitive labour. Just as economic production is more efficient when workers specialise, knowledge production is more efficient and more reliable when epistemic agents specialise in different domains and share their expertise. The system of peer review, professional credentialing, replication, and criticism that constitutes modern science is a highly evolved mechanism for producing reliable knowledge through divided labour.
The injunction to “do your own research” implicitly rejects this division. It treats the non-specialist layperson’s personal investigation as epistemically equivalent to—or superior to—the coordinated efforts of specialist communities. This is not epistemic modesty; it is a form of epistemic arrogance, and it is self-undermining: the layperson lacks the training to recognise the limits of her own competence in the relevant domain.
5.1.3 The Epistemic Superhero
Buzzell and Rini argue that the epistemic superhero faces three interlocking problems:
- Informational overload: The sheer volume of relevant evidence overwhelms any individual’s capacity to process it. The epistemic superhero cannot possibly evaluate all the evidence bearing on a complex question.
- The Dunning-Kruger problem: The less one knows about a domain, the less able one is to recognise one’s own ignorance. Epistemic superheroism therefore tends to produce overconfidence in the very people least equipped to evaluate the evidence.
- Vulnerability to manipulation: By withdrawing trust from established epistemic institutions, the epistemic superhero becomes dependent on whatever information sources happen to be most accessible or most psychologically compelling—which, in the contemporary information environment, are often sources of misinformation.
5.1.4 When Deference to Experts Is Rational—and When It Is Not
Epistemic deference to experts is rational when the following conditions are met: (a) there is a genuine expert community in the relevant domain with recognised standards of evidence and accountability; (b) the experts have a strong track record of reliability on the relevant type of question; (c) the layperson lacks the background knowledge to evaluate the evidence independently; and (d) the experts are not subject to strong incentive pressures that would bias their testimony.
Deference becomes irrational when these conditions fail. A legitimate reason to question expert consensus arises, for example, when experts have significant financial or political ties to the conclusions they endorse (as in early tobacco research), when the expert community has a history of motivated error on similar questions, or when independent credentialed experts in the same field substantially disagree. The key point is that legitimate scepticism about expert consensus is targeted and evidence-based, not blanket and reflexive.
5.1.5 Social Epistemology as the Alternative
The alternative to epistemic superheroism is not blind trust, but calibrated epistemic dependence: a disposition to rely on others in proportion to their demonstrated reliability, while remaining open to evidence that particular sources are untrustworthy. This is the normal condition of rational agents in a complex society, and it is nothing to be ashamed of.
5.2 Nguyen on Echo Chambers and the Corruption of Autonomy
C. Thi Nguyen’s analysis of echo chambers (2020) adds an important dimension to the discussion of epistemic autonomy. Nguyen argues that echo chambers do not merely deprive their members of information—they actively corrupt the capacity for independent thinking by producing an illusion of independent inquiry.
5.2.1 The Autonomy Illusion
Within an echo chamber, members believe they are thinking for themselves. They have investigated the evidence, evaluated competing claims, and reached their own conclusions. But the investigation was conducted within an environment that had already systematically excluded disconfirming evidence and pre-emptively discredited outside sources. The “independent inquiry” was therefore not independent at all—it was conducted within an epistemically controlled space designed to produce a particular conclusion.
This is what makes echo chambers more epistemically dangerous than epistemic bubbles. An epistemic bubble leaves its members unaware of certain information; an echo chamber leaves its members believing they have considered and rejected that information when they have actually only been exposed to a heavily distorted version of it, pre-packaged for dismissal. The echo chamber produces an illusion of independence that makes its members resistant to recognising their own epistemic dependence.
5.2.2 Conspiracy Theorising and the Autonomy Value
The connection to conspiracy theorising is direct. A central appeal of conspiracy theorising communities is that they present themselves as bastions of independent thinking—places where you can break free from the “mainstream media” and “think for yourself.” This rhetoric of autonomy is part of the echo chamber’s structure: it reinforces members’ sense that they have achieved epistemic independence while actually installing a more thoroughgoing epistemic dependence on the community’s approved sources and interpretive frameworks.
Buzzell and Rini’s argument is complementary: the slogan “do your own research” is the rhetorical vehicle through which the echo chamber recruits new members by flattering their sense of epistemic self-sufficiency while directing them toward a curated information environment.
5.3 Cassam on Intellectual Vices
Quassim Cassam (2019) approaches the epistemology of conspiracy theorising from a virtue-theoretic direction. Rather than asking about the epistemic quality of conspiracy theories as explanations, Cassam asks about the intellectual vices of conspiracy theorists—the character traits and cognitive dispositions that sustain conspiracy theorising.
5.3.1 Gullibility, Closed-Mindedness, and Epistemic Cowardice
Cassam identifies several intellectual vices that tend to characterise conspiracy theorising. Gullibility involves a disposition to accept claims on insufficient evidence, particularly when they flatter one’s self-image as a member of a knowing elite. Closed-mindedness involves a refusal to seriously consider evidence and arguments that challenge one’s existing views. Epistemic cowardice involves a disposition to avoid the discomfort of genuine uncertainty by retreating to emotionally satisfying but poorly evidenced narratives.
These vices are not merely cognitive failures—they are character traits that can be cultivated or reformed over time, and they interact in mutually reinforcing ways. A conspiracy theorist who is closed-minded to counter-evidence will tend, over time, to become more gullible about confirmatory claims (having developed no practice of critical scrutiny), and more epistemically cowardly (having learned to rely on the comfort of the conspiracy narrative as a refuge from complexity).
5.3.2 Stokes on the Perils of Pure Particularism
Patrick Stokes’s “Conspiracy Theory and the Perils of Pure Particularism” (2018) addresses the particularist position from a different angle, arguing that even if particularism is theoretically correct, pure particularism is practically dangerous.
Stokes grants that strict generalism—the view that all conspiracy theories are irrational—is untenable. But he argues that the move from generalism to unrestricted particularism is equally problematic. Pure particularism treats conspiracy theories merely as a formal category of explanation and evaluates each one entirely on its specific evidence. This ignores the fact that conspiracy theorising is a social practice with a history, a culture, and a repertoire of characteristic moves.
Stokes introduces an important dimension that is often absent from purely epistemological discussions: the moral costs of conspiracy theorising. Conspiracy theories characteristically involve making serious accusations against identified individuals or groups—accusations of murder, corruption, or fraud—often before the evidence is sufficient to warrant such charges. These accusations can cause real harm to the accused, regardless of whether the conspiracy theory is true.
The upshot of Stokes’s argument is that epistemic humility and reticence are not obstacles to inquiry but virtues. The recognition that conspiracy theorising involves accusing real people of serious wrongdoing should make us cautious about endorsing conspiracy claims without strong evidence. This is a form of intellectual responsibility, not intellectual cowardice.
Chapter 6: Echo Chambers and the Spread of Conspiracy Theories
How do conspiracy theories spread, and what social-epistemic structures sustain them? This chapter examines C. Thi Nguyen’s (2020) influential distinction between echo chambers and epistemic bubbles, Sunstein and Vermeule’s (2009) analysis of the causes and proposed cures for conspiracy theorising, and the structural role of digital platforms.
6.1 Nguyen on Echo Chambers and Epistemic Bubbles
C. Thi Nguyen’s “Echo Chambers and Epistemic Bubbles” (2020) draws a sharp distinction between two social-epistemic structures that are often conflated in public discourse.
6.1.1 Two Distinct Structures
The two structures differ in kind, not merely in degree. An epistemic bubble is a problem of omission: certain relevant voices are simply not heard. An echo chamber is a problem of active exclusion: outside voices are heard but are pre-emptively discredited as untrustworthy, biased, or part of a conspiracy.
6.1.2 Why the Distinction Matters
Nguyen argues that the distinction has profound practical implications:
- Epistemic bubbles are relatively easy to burst. Simply exposing someone to the missing information or perspectives can shatter an epistemic bubble. If you have never heard the counter-argument, hearing it may change your mind.
- Echo chambers are extremely difficult to escape. Presenting counter-evidence to someone inside an echo chamber may actually reinforce the chamber, because the echo chamber has already primed its members to interpret outside information as hostile propaganda. The very act of presenting contrary evidence can be taken as proof that the outside world is trying to deceive them.
6.1.3 Echo Chambers and Conspiracy Theories
Nguyen’s analysis illuminates why conspiracy theories are so resilient once they take hold within a community. Conspiracy-theorising communities often function as echo chambers: they do not merely lack exposure to contrary evidence (an epistemic bubble) but have been systematically taught to distrust all sources of contrary evidence (an echo chamber). Mainstream media, government agencies, scientific institutions—all are cast as part of the conspiracy, and their testimony is therefore not just ignored but treated as further evidence for the conspiracy.
6.1.4 Group Polarisation Inside the Echo Chamber
Nguyen’s structural analysis also connects to the well-documented phenomenon of group polarisation. When like-minded individuals deliberate together, their views tend to shift toward more extreme versions of their initial positions. Inside conspiracy-theorising communities, this dynamic amplifies conspiratorial beliefs over time: members compete to offer ever more elaborate and definitive versions of the theory, and moderation is socially penalised. Group polarisation explains why conspiracy forums tend to produce increasingly extreme beliefs rather than gradual convergence toward more moderate positions.
6.2 Sunstein and Vermeule on Causes and Cures
Cass Sunstein and Adrian Vermeule’s “Conspiracy Theories: Causes and Cures” (2009) is one of the most discussed—and most controversial—contributions to the policy-oriented literature on conspiracy theories.
6.2.1 Crippled Epistemology
Sunstein and Vermeule argue that conspiracy theories typically arise not from irrationality per se but from a crippled epistemology. Conspiracy theorists are often rational processors of the information available to them; the problem is that their informational environment is severely distorted. This can result from social isolation, membership in insular communities, or exposure to a biased information diet.
This is an important and philosophically significant move. It shifts the locus of the epistemic problem from the individual believer to the information environment. Conspiracy theorists with crippled epistemologies are not stupid or crazy; they are making reasonable inferences from an impoverished and skewed informational base. This diagnosis has different implications for how we should respond to conspiracy theories than a diagnosis based on individual irrationality or cognitive deficiency.
6.2.2 Informational and Reputational Cascades
The authors identify two mechanisms through which conspiracy theories spread:
- Informational cascades: When individuals observe others endorsing a conspiracy theory, they may take this as evidence that the theory is true, especially if they lack independent evidence to the contrary. Each new endorsement increases the perceived credibility of the theory, creating a self-reinforcing cycle.
- Reputational cascades: When endorsing a conspiracy theory becomes a marker of group identity, individuals may affirm the theory to maintain their standing within the group, regardless of their private beliefs. The social cost of dissent can be very high.
6.2.3 The Cognitive Infiltration Proposal
Sunstein and Vermeule’s most controversial proposal is cognitive infiltration: the idea that government agents or their allies should covertly enter conspiracy-theorising communities and introduce information that undermines the conspiracy theory from within. The rationale is that direct confrontation from outside sources is likely to be dismissed by conspiracy theorists (consistent with Nguyen’s analysis of echo chambers), so change must come from within the community.
6.2.4 Do Government Counter-Speech Programs Work?
The cognitive infiltration proposal is one instance of a broader question: can counter-speech programs—government or civil-society efforts to correct conspiracy theories through direct engagement—effectively reduce conspiracy belief? The empirical literature is mixed. Some studies find that corrections modestly reduce belief in specific conspiracy theories; others find that corrections produce a backfire effect, strengthening conspiracy beliefs among those most committed to them. The theoretical framework provided by Nguyen suggests why: corrections presented by outside sources will be filtered through the echo chamber’s pre-existing distrust of those sources, potentially reinforcing rather than undermining the theory.
This implies that the most effective counter-speech strategies may be those that address the structural conditions sustaining conspiracy theories—information poverty, social isolation, institutional distrust—rather than those that directly challenge specific theories.
6.2.5 The Self-Sealing Quality
Sunstein and Vermeule also identify the self-sealing quality of conspiracy theories: attempts to dispel conspiracy theories can be interpreted by conspiracy theorists as further evidence for the conspiracy. This creates a dilemma for those who wish to combat conspiracy theories: engaging with them risks reinforcing them, but ignoring them allows them to spread unchecked.
6.3 Algorithmic Amplification and the Digital Information Environment
The structural analysis of conspiracy theory spread has taken on new dimensions in the age of social media. Platform algorithms are designed to maximise engagement—the time users spend on the platform, the content they interact with, the emotional reactions they register. Research consistently shows that outrage, fear, and novelty are among the most powerful engagement drivers. Conspiracy theories, which combine all three, are particularly well-suited to algorithmic amplification.
The result is a structural incentive alignment between platform architecture and conspiracy theory spread. Platforms profit from engagement; conspiracy theories drive engagement; therefore platforms algorithmically amplify conspiracy theories, regardless of their truth value. This is not a deliberate conspiracy—it is an emergent property of the incentive structure—but it produces a systematic bias toward amplifying misleading content. The Sunstein-Vermeule framework of crippled epistemology takes on new force in this context: the scale of algorithmic curation means that the informational environment of millions of people can be systematically distorted in ways that promote conspiracy belief.
Chapter 7: The Seductions of Clarity and Conspiracy as Play
This chapter explores two accounts of why conspiracy theories are psychologically attractive, moving beyond the familiar explanations based on ignorance or irrationality. Nguyen (2021) argues that conspiracy theories exploit our reliance on the feeling of clarity, while Levy (2023) proposes that many conspiracy theorists engage with conspiracy theories as a form of play rather than sincere belief.
7.1 Nguyen on the Seductions of Clarity
C. Thi Nguyen’s “The Seductions of Clarity” (2021) offers a novel account of the psychological appeal of conspiracy theories, grounded in an analysis of how we use cognitive feelings to guide inquiry.
7.1.1 Clarity as a Cognitive Feeling
Nguyen argues that clarity is not the same as understanding. Genuine understanding involves grasping the relevant causal structures, mechanisms, and complexities of a phenomenon. The feeling of clarity is merely the subjective sense that one understands. These can come apart: we can feel that we understand something when we do not, and we can genuinely understand something without feeling clear about it.
7.1.2 Clarity as a Thought-Terminating Heuristic
In everyday cognitive life, we use the feeling of clarity as a thought-terminating heuristic: when we feel that we understand something, we stop inquiring into it. This is usually efficient—we cannot investigate everything forever—but it creates a vulnerability. If someone can induce an artificial sense of clarity, they can cause us to stop inquiring prematurely, before we have noticed the flaws in a belief system.
7.1.3 Conspiracy Theories as Seductively Clear
Conspiracy theories are, on Nguyen’s analysis, paradigmatically seductive in this way. They offer to explain complex, messy, confusing events in terms of a single, coherent narrative with clear agents, clear motives, and clear causal chains. The world becomes intelligible: everything happens for a reason, and that reason is the intentional action of identifiable agents. This narrative simplicity produces a powerful feeling of clarity, which in turn terminates further inquiry.
7.1.4 Dead-End Epistemology
Nguyen’s framework suggests a further problem with conspiracy theories beyond their falsity: they function as dead-end epistemology (死胡同认识论). Once a conspiracy theory is in place, the feeling of clarity it produces terminates further inquiry. There are no more questions to ask, no more evidence to seek—the answer is known. This is epistemically catastrophic because genuine understanding of complex events requires continued inquiry, and complex events (political crises, public health emergencies, economic failures) are precisely the events that most demand ongoing investigation.
The conspiracy theory replaces the open-ended process of inquiry with a closed, pre-packaged answer. In this sense, it functions less as an explanation than as an explanation-terminator: it prevents the kind of sustained, self-critical inquiry that genuine understanding requires.
7.1.5 Epistemic Manipulation
Nguyen’s analysis has important implications for understanding epistemic manipulation. Epistemic manipulators can exploit the seductions of clarity by designing belief systems that maximise the feeling of understanding while minimising genuine understanding. Conspiracy theories, propaganda, and certain ideological systems all share this structure: they offer simple, satisfying explanations that discourage further investigation.
7.2 Motivated Reasoning and the Corruption of Epistemic Virtues
Nguyen’s analysis of clarity connects to the broader psychological literature on motivated reasoning. Motivated reasoning occurs when an agent’s desire to reach a particular conclusion influences the process of belief formation, causing them to apply different evidential standards to evidence that supports versus challenges the desired conclusion.
What is distinctive about conspiracy theories is that they can corrupt normal epistemic virtues. Curiosity—ordinarily a productive epistemic disposition—can be redirected into anomaly hunting, where it produces an endless supply of apparent disconfirmations of the official story rather than genuine understanding. Pattern recognition—ordinarily a valuable cognitive skill—can be over-applied to find meaningful connections in random noise. Critical thinking—ordinarily a safeguard against misinformation—can be deployed selectively, applied rigorously to official accounts but not to conspiracy narratives. In each case, a genuine epistemic virtue is harnessed in service of a pre-formed conclusion.
7.3 Levy on Conspiracy Theories as Serious Play
Neil Levy’s “Conspiracy Theories as Serious Play” (2023) proposes a provocative reinterpretation of what many conspiracy theorists are doing when they engage with conspiracy theories.
7.3.1 The Serious Play Hypothesis
Levy argues that for many conspiracy theorists, engagement with conspiracy theories is a form of serious play rather than sincere belief. Just as a viewer of a conspiracy thriller suspends disbelief for the duration of the film, many conspiracy theorists suspend disbelief in their theories without committing to them as settled convictions. They enjoy the intellectual puzzle, the feeling of having special knowledge, and the social bonds formed through shared speculation.
7.3.2 The Pleasures of Puzzle-Solving and Aesthetic Dimensions
Levy’s account draws attention to the aesthetic dimensions of conspiracy theorising that purely epistemological analyses overlook. Conspiracy theories have a distinctive aesthetic: they offer the pleasures of detective fiction—hidden clues, surprising connections, revelatory moments—combined with the flattery of being among the initiated few who can see through the deception.
This aesthetic appeal is not trivial. Human beings are drawn to puzzles, patterns, and narratives with hidden depths. The activity of conspiracy theorising—investigating anomalies, finding connections, decoding hidden meanings—mimics the pleasures of genuine inquiry. This helps explain why conspiracy theorising can be appealing even to people who, in their ordinary lives, have no particular epistemic vices: they are drawn by the aesthetic pleasures of the form rather than by cognitive deficiency.
7.3.3 Evidence for the Serious Play Hypothesis
Levy points to several features of conspiracy-theorising communities that are better explained by the serious play hypothesis than by the assumption of sincere belief:
- Gamification: Conspiracy-theorising communities often treat the investigation of conspiracy theories as a game, with puzzles to solve, clues to decode, and levels of initiation. This ludic structure is more consistent with play than with sincere inquiry.
- Inconsistency: Conspiracy theorists often hold mutually inconsistent theories simultaneously, or shift between theories without apparent concern. This is puzzling if they sincerely believe these theories, but natural if they are engaging in play.
- Relationship to evidence: Conspiracy theorists’ characteristic relationship to evidence—embracing confirming evidence enthusiastically while dismissing disconfirming evidence—resembles the selective engagement with evidence characteristic of play more than the systematic evaluation characteristic of sincere inquiry.
7.3.4 Why “Play” Can Still Be Harmful
The fact that conspiracy theorising may be a form of play does not make it harmless. Levy’s concept of “serious” play emphasises that play can have real-world consequences. Players may act on the basis of their play-beliefs, spreading misinformation, harassing individuals accused of conspiracy, or making political decisions influenced by conspiracy narratives. The psychological boundary between play-belief and sincere belief is also porous: what begins as entertaining speculation can, over time, harden into genuine conviction.
Chapter 8: The Politics of Conspiracy Theories
The final chapter turns from the epistemology of conspiracy theories to their politics. Quassim Cassam (2019) argues that conspiracy theories are fundamentally a form of political propaganda, while Matthew Shields (2022) challenges the assumption that conspiracy theorising is the province of fringe extremists.
8.1 Cassam on Conspiracy Theories as Propaganda
Quassim Cassam’s Conspiracy Theories (2019), particularly Chapter 1 (“The Real Point of Conspiracy Theories”), reframes the entire debate about conspiracy theories by shifting attention from their epistemic status to their political function.
8.1.1 The Functionalist Approach
Cassam argues that the defining characteristics of conspiracy theories—their speculative character, their resistance to evidence, their appeal to hidden agents—are best explained not by the cognitive deficiencies of their adherents but by the political purposes they serve. Conspiracy theories advance ideological objectives: they discredit opponents, mobilise supporters, and provide narratives that justify particular political programmes.
8.1.2 Conspiracy Theories as Weapons Against Epistemic Institutions
Cassam’s political analysis identifies a particularly corrosive function: conspiracy theories as weapons against epistemic institutions. Epistemic institutions—universities, scientific bodies, quality journalism, independent courts—are the infrastructure through which modern societies produce and adjudicate knowledge. Conspiracy theories that systematically undermine trust in these institutions do not merely spread misinformation about particular events; they degrade the shared epistemic environment on which democratic deliberation depends.
When a sufficiently large proportion of a population believes that scientific consensus is manufactured, that courts are rigged, that journalism is fake, and that elections are stolen, the common epistemic ground required for democratic politics erodes. Political disagreements can no longer be resolved by appeal to shared evidence because the very concept of shared evidence has been contaminated. This is one of the deepest political harms of conspiracy theorising.
8.1.3 Conspiracy Theories as Propaganda
On Cassam’s account, conspiracy theories are a form of political propaganda. This does not mean that all conspiracy theorists are cynical manipulators who know their theories are false. Some are; but many sincerely believe what they propagate. The point is that conspiracy theories function as propaganda regardless of the intentions of those who spread them. Their effect is to advance a political or ideological cause by means of claims that are, at best, poorly supported by evidence.
8.1.4 Three Defining Features
Cassam identifies three features that characterise conspiracy theories:
- Speculative: Conspiracy theories go well beyond the available evidence, postulating elaborate hidden mechanisms and agents.
- Contrarian: They are contrary to the official account and the obvious explanation of events.
- Esoteric: They claim access to hidden knowledge unavailable to the general public.
These features make conspiracy theories particularly effective as propaganda: their speculative character makes them flexible, their contrarian character makes them appealing to those who distrust institutions, and their esoteric character creates a sense of in-group belonging among those who “know the truth.”
8.1.5 Implications for Response
If conspiracy theories are fundamentally political rather than epistemic, then the response to them must also be political rather than purely epistemic. Providing better evidence or more rigorous arguments may not be sufficient if the conspiracy theory’s appeal is rooted in political identity, group solidarity, or ideological commitment. Cassam suggests that addressing the political conditions that make conspiracy theories attractive—inequality, alienation, institutional distrust—is at least as important as debunking individual theories.
8.2 The Asymmetry Debate
A contested empirical and normative question in the study of conspiracy theories concerns whether they are distributed symmetrically across the political spectrum or disproportionately concentrated on one side. The asymmetry thesis holds that in contemporary Western democracies, conspiracy theories are significantly more prevalent on the political right than the left.
8.2.1 Evidence For and Against Asymmetry
Some empirical research supports the asymmetry thesis: surveys of belief in specific conspiracy theories (e.g., climate denial, QAnon, election fraud claims) find higher rates among right-leaning respondents. This is often explained by reference to the psychological correlates of political conservatism—lower tolerance for uncertainty, higher need for cognitive closure—which would make right-leaning individuals more susceptible to the sense-making appeal of conspiracy theories.
Critics of the asymmetry thesis point out, however, that it is susceptible to confirmation bias. Researchers are often drawn to studying right-wing conspiracy theories because they are more visible in political discourse; left-wing conspiracy theories (e.g., certain narratives about police killings, corporate malfeasance, or government surveillance) may receive less attention. The asymmetry may reflect the biases of researchers and media rather than the actual distribution of conspiracy belief.
8.2.2 The Normative Stakes
The asymmetry debate has significant normative stakes. If conspiracy theories are primarily a right-wing phenomenon, then addressing them may appear to be a partisan project—a way of attacking the political right under the guise of epistemological concern. If conspiracy theories are symmetrically distributed, then addressing them can present itself as a genuinely neutral epistemic enterprise. The perceived neutrality of the enterprise affects its credibility and its practical effectiveness.
8.3 Shields on Rethinking the Politics of Conspiracy Theories
Matthew Shields’s “Rethinking Conspiracy Theories” (2022) challenges a widespread assumption in both the philosophical literature and public discourse: that conspiracy theories are primarily produced and consumed by marginal, extremist groups.
8.3.1 Dominant vs. Non-Dominant Institution Conspiracy Theories
Shields argues that the philosophical literature on conspiracy theories has overwhelmingly focused on non-dominant institution conspiracy theories—the theories of QAnon followers, flat-earthers, anti-vaxxers, and similar groups—while ignoring or downplaying dominant institution conspiracy theories: those produced and promoted by powerful state and corporate actors. Yet the latter, Shields contends, are far more consequential.
8.3.2 The Political Economy of Conspiracy Theories
Shields advances what might be called a political economy of conspiracy theories: an analysis of how conspiracy theories are produced, distributed, and consumed as a function of institutional power and economic interests. On this analysis, conspiracy theories are not simply products of individual irrationality or informational deprivation; they are also products of deliberate strategies by powerful actors who benefit from spreading them.
Intelligence agencies, corporate communications departments, and political operatives all have resources to fund, produce, and distribute conspiracy theories when it serves their interests. The tobacco industry’s conspiracy to suppress evidence of smoking’s harms was itself propagated through the production and circulation of an alternative theory—that the scientific consensus on smoking was manufactured by anti-industry researchers. Dominant institutions are, in Shields’s framework, among the most prolific producers of conspiracy theories.
8.3.3 The Case Against Generalism, Revisited
Shields argues that generalists—those who claim that conspiracy theories are epistemically defective as a class—uncritically centre non-dominant institution conspiracy theories in their analysis. When we look at dominant institution conspiracy theories, the generalist position becomes much harder to sustain:
- Expertise: Dominant institution conspiracy theories are often produced by individuals with genuine expertise and institutional backing—intelligence analysts, government officials, established journalists. The generalist claim that conspiracy theorists lack relevant expertise does not apply here.
- Mainstream ideology: Dominant institution conspiracy theories are often embedded in mainstream, purportedly non-extremist political ideologies. The generalist claim that conspiracy theories are products of extremism does not apply here.
- Consequences: Dominant institution conspiracy theories have historically been far more harmful than non-dominant ones. The claim that Iraq possessed weapons of mass destruction, for example, was a conspiracy theory (it posited a hidden conspiracy by the Iraqi government) produced by dominant institutions, and it was used to justify a war that caused hundreds of thousands of deaths.
8.3.4 Implications for the Field
Shields’s analysis has significant implications for the study of conspiracy theories. If we take seriously the existence and harmfulness of dominant institution conspiracy theories, we cannot maintain a blanket scepticism toward conspiracy theories without also becoming sceptical of the claims made by powerful institutions—which is precisely the attitude that generalists typically criticise. The field must grapple with the uncomfortable fact that the most dangerous conspiracy theories may come not from the margins but from the centre of political power.
8.4 The Corrosive Effects on Democracy and a Synthesizing Conclusion
8.4.1 Erosion of Shared Epistemic Ground
Democracy requires, at minimum, a shared epistemic ground: citizens must be able to recognise a common set of facts, accept the outcomes of common procedures for adjudicating disputes, and engage in deliberation on the basis of shared evidence. Conspiracy theories, in both their dominant and non-dominant institutional forms, erode this shared ground.
When substantial portions of the electorate believe that election results are fabricated, that public health guidance is a tool of population control, or that climate science is a hoax, they cannot participate fully in democratic deliberation—they inhabit a different epistemic world from their fellow citizens. The fragmentation of the epistemic commons is among the most serious political consequences of widespread conspiracy theorising.
This consequence is politically asymmetric in a significant sense: those who benefit from the erosion of democratic legitimacy have strong incentives to promote conspiracy theories that undermine trust in democratic institutions, while those who benefit from stable democratic governance have incentives to support epistemic institutions. The political economy of conspiracy theories thus connects directly to the political economy of democratic stability.
8.4.2 What Philosophy Contributes
The philosophical study of conspiracy theories makes several distinctive contributions that empirical social science alone cannot provide.
First, philosophy clarifies the conceptual landscape: the distinctions between conspiracy theories and conspiracy beliefs, between warranted and unwarranted conspiracy theorising, between epistemic bubbles and echo chambers, and between the pejorative and neutral uses of “conspiracy theory” are conceptual achievements that provide the framework within which empirical research can be designed and interpreted.
Second, philosophy provides normative standards: it identifies the epistemic virtues and vices relevant to conspiracy theorising, the conditions under which expert deference is rational, and the standards of evidence that should govern the assessment of conspiracy claims. These normative standards are not given by empirical research; they require philosophical argument.
Third, philosophy connects the epistemology of conspiracy theories to broader questions about knowledge, power, and democracy: how epistemic institutions function, what makes them trustworthy, how they can fail, and what conditions are required for genuine democratic deliberation. These connections illuminate why conspiracy theorising is not merely an intellectual curiosity but a phenomenon with deep political significance.
Summary: Key Themes and Debates
The philosophical study of conspiracy theories has matured rapidly since Keeley’s foundational 1999 paper. Several cross-cutting themes emerge from the literature surveyed in this course:
1. Definition matters. Whether we adopt a minimalist or a loaded definition of “conspiracy theory” profoundly shapes all subsequent questions about rationality, blame, and response. Minimalists (Keeley, Duetz) keep the term neutral; others build epistemic or political features into the definition (Napolitano, Cassam). Clarke’s revisionary proposal—that conspiracy theories are defined by their positing of unjustified agency—makes them epistemically defective by definition, a move resisted by particularists.
2. The generalism-particularism debate remains unresolved. Generalists (Levy 2007, Sunstein and Vermeule) argue for systematic scepticism toward conspiracy theories. Particularists (Coady, Dentith) argue for case-by-case evaluation. Intermediate positions (Stokes) advocate constrained particularism tempered by moral and epistemic caution. The Bayesian question of what prior probability to assign conspiracy theories cuts across this debate.
3. Epistemic autonomy is more limited than we think. Buzzell and Rini show that the ideal of “doing your own research” is often self-defeating. Levy (2007) argues that knowledge is constitutively social. The rational response is not radical autonomy but calibrated trust. Nguyen shows that echo chambers corrupt the value of epistemic autonomy by producing an illusion of independent thinking.
4. Social-epistemic structures sustain conspiracy theories. Nguyen’s distinction between echo chambers and epistemic bubbles illuminates why conspiracy theories are so resistant to correction. Sunstein and Vermeule’s analysis of informational and reputational cascades explains how they spread. Algorithmic amplification in digital environments adds a structural dimension that magnifies these dynamics at scale.
5. The appeal of conspiracy theories is not purely cognitive. Nguyen (2021) points to the seductions of clarity and the dead-end epistemology of thought-terminating narratives; Levy (2023) points to the pleasures of play and the aesthetic appeal of puzzle-solving. Motivated reasoning can corrupt genuine epistemic virtues. Conspiracy theories satisfy psychological needs that rational argument alone may not address.
6. The politics of conspiracy theories cannot be separated from their epistemology. Cassam argues that conspiracy theories are propaganda and weapons against epistemic institutions; Shields argues that the most dangerous conspiracy theories come from dominant institutions. The erosion of shared epistemic ground constitutes a fundamental threat to democratic deliberation. The study of conspiracy theories is, inescapably, a study of power.