String of recent killings linked to Bay Area 'Zizians'
Initial reactions and “Vegan Sith” ideology
- Many commenters reacted with disbelief and dark humor to the “vegan Sith” framing, calling it something that would normally sound like parody.
- Several linked the group’s extremism to a broader “post‑ironic” internet culture where movements adopt absurd aesthetics (e.g. Boogaloo Boys) while being deadly serious.
- Some argued the whole thing feels like an LLM hallucination made real: Star Wars + veganism + AI apocalypse + tactical cosplay + murder.
Mental illness, delusion, and responsibility
- Long subthreads debate when mental illness absolves responsibility.
- Some stress that many severe conditions legally impair judgment; others argue people can be mentally ill yet still know right from wrong.
- Commenters note a common pattern: people who function “normally” in daily life but hold highly delusional or fringe beliefs, sometimes tipped into action by reinforcing peer networks.
- There is skepticism about whether the DSM and psychiatric diagnosis function as genuine tools for care or as instruments of social control.
Cult dynamics, techniques, and Ziz’s philosophy
- Multiple links describe Zizians as a classic high‑control cult: cutting members off from friends/internet, demanding ideological purity, and using dense jargon to shape thought.
- Indoctrination methods reportedly include extreme sleep deprivation and so‑called “unihemispheric sleep” to induce dissociation and multiple “demons”/personas. Several commenters treat this as both unethical and likely psychologically destabilizing.
- Ziz is described (via secondary sources) as combining hardline vegan utilitarianism, an extreme decision theory (never back down; always escalate retaliation), and AI eschatology (future AIs punishing moral “failures” such as eating meat).
- Some note the internal logic is “nerd philosophy” taken to an adolescent, pulp-fiction extreme.
Connections to Rationalism, EA, and AI doom
- Many see the group as a fringe offshoot of Bay Area rationalist / effective altruist circles, but not representative of them.
- Others emphasize longstanding issues: cult‑like institutions around rationalism (CFAR, MIRI, Leverage), AI doomerism framed like a secular “Rapture of the nerds,” and charismatic leaders with grand moral projects.
- Commenters worry that apocalyptic AI rhetoric plus insular communities could normalize violence “for the greater good,” even if most participants just write thinkpieces.
Trans identity, recruitment, and media framing
- A major thread contests whether it’s relevant that many Zizians are trans and that the group appears to have targeted trans people for recruitment.
- Some argue omitting this in mainstream coverage is biased or evasive; others say centering it fuels moral panic and right‑wing narratives that conflate trans identity with danger.
- Several note that marginalized and traumatized groups (including trans and neurodivergent people) can be especially vulnerable to cult recruitment and “break free from your mental cage” messaging.
- There is pushback against far‑right outlets framing this primarily as “trans terror,” and against using the case to smear all trans people.
Rationalist community’s internal response
- People involved with LessWrong/rationalist spaces state Zizians were warned about and banned from events and platforms years ago.
- They frame the connection this way: rationalist culture is unusually welcoming to weird ideas and people, which helped the cult form on its fringe, but the killings reflect a splinter group that had long since been ostracized.
- Others counter that rationalist/EA ecosystems repeatedly incubate extreme offshoots (SBF, neoreactionary subcultures, The Motte, etc.), so something in the culture—status around “being rational,” apocalyptic stakes, IQ obsession—bears scrutiny.
Broader critiques of “rationalism”
- Several commenters argue that what’s called “rationality” is often elaborate rationalization: chaining abstract arguments far beyond available evidence, especially with infinities and tiny probabilities (Pascal’s‑mugging‑style reasoning).
- They note that trying to reason everything “from first principles” without constant empirical grounding can push smart but unstable people into self‑consistent madness.
- Some see rationalist spaces as attractive to socially isolated, highly online people (including many trans and autistic folks), which can amplify feedback loops rather than correct them.
Media, politics, and risk going forward
- Commenters highlight selective coverage: mainstream outlets emphasizing cult and AI/vegan angles; right‑wing ones emphasizing “trans terror”; and almost everyone ignoring the detailed philosophical backstory.
- There are calls to distinguish between:
  - cult ideology and methods,
  - rationalist/EA/AI‑safety ideas in general, and
  - trans identity or other demographics,

  in order to avoid either sanitizing or scapegoating.
- Some worry the case will be weaponized: against trans people, against any AI‑risk discussion, or against “too much thinking” in general, rather than prompting more nuanced reflection on how high‑intensity intellectual subcultures can go very wrong.