Why are there so many rationalist cults?

What “Rationalism” Means Here

  • The thread distinguishes philosophical rationalism from the internet “Rationalist” scene clustered around LessWrong, The Sequences, effective altruism (EA), AI risk, Bayesian reasoning, etc.
  • Some argue the label “rationalist” is intrinsically arrogant or cult-bait; others say it just denotes “trying to avoid cognitive biases via evidence and reasoning.”
  • Confusion is compounded by overlap with Silicon Valley culture, effective altruism, and adjacent online subcultures.

Why This Milieu Produces Cults

  • Loneliness, loss of traditional community, and desire for meaning make people vulnerable to intense, high-commitment groups.
  • Rationalist meetups, group houses, and Burning Man camps can morph into high-demand micro-communities: isolation from outsiders, shared jargon, escalating “heroic” missions (save the world, fix AI, cure the leader’s depression).
  • Narcissistic or unstable leaders exploit this: classic cult pattern of adulation, sexual access, financial and emotional control, justified as “rational” or “for the greater good.”
  • Several commenters think the groups named in the article are essentially standard cults that happened to recruit from a rationalist-heavy pool.

Critiques of Rationalist Practice

  • Overconfidence: chaining long sequences of “rational” inferences from shaky premises, while underweighting how error and uncertainty compound at each step, produces wild conclusions held with high confidence.
  • Disdain for intuition, social norms, and “mainstream epistemology” removes important safety rails; people who “outperform society” in one area can end up far more wrong overall.
  • Recurring themes include purity spirals and “double updates”: relaxing one’s priors in the name of open-mindedness, then treating speculative evidence as overwhelming, especially around AI doom and exotic ethics.
  • Some see the movement as reinventing philosophy with less rigor, ignoring 2,500 years of existing work.

Are Rationalists Uniquely Bad?

  • Several commenters question the premise: any large, idealistic, intellectually self-conscious movement (religions, Objectivism, EST, New Age, fandoms) spawns cult offshoots.
  • Others argue rationalists are especially prone because they prize abstract argument over lived experience; see themselves as uniquely smart; and cluster in high-status, money-rich tech hubs.
  • There is tension between two readings: “a few small, toxic offshoots in a mostly normal scene” versus “the core ideology and social style create systematic cult risk.”