Facts don't change minds, structure does

Beliefs as Structures, Not Isolated Facts

  • Many commenters agree with modeling beliefs as interconnected graphs: new facts tug on multiple links, and people resist changes that would destabilize large parts of the structure (a toy sketch of this follows the list).
  • Single contradictory facts often shouldn’t flip major beliefs (e.g., one fraudulent climate paper vs. a huge evidence base).
  • People develop “epistemic learned helplessness”: after seeing clever but conflicting arguments, they rationally adopt a defensive stance against being persuaded.
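One way to picture the graph model from the first bullet: the "cost" of revising a belief can be read as the size of the structure that depends on it. The graph and belief names below are an illustrative toy, not the article's actual diagram, and the specific beliefs are hypothetical placeholders.

```python
from collections import deque

# Toy belief graph (illustrative only): each belief points to the beliefs
# that depend on it, i.e. the ones that would also need revision if it fell.
belief_graph = {
    "climate is warming": ["models are trustworthy", "policy X is needed"],
    "models are trustworthy": ["scientists are honest"],
    "policy X is needed": ["my party is right"],
    "scientists are honest": [],
    "my party is right": ["my community's judgment is sound"],
    "my community's judgment is sound": [],
}

def revision_cost(graph: dict[str, list[str]], belief: str) -> int:
    """Count how many downstream beliefs would be destabilized if `belief` fell."""
    seen, queue = set(), deque([belief])
    while queue:
        node = queue.popleft()
        for dependent in graph.get(node, []):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return len(seen)

# A fact attacking a load-bearing node meets far more resistance than one
# attacking a leaf, even if both facts are equally well evidenced.
print(revision_cost(belief_graph, "climate is warming"))     # 5 downstream beliefs
print(revision_cost(belief_graph, "scientists are honest"))  # 0 downstream beliefs
```

On this reading, "resistance" is not irrational stubbornness so much as a refusal to pay a large restructuring cost on the strength of a single tug.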

Emotion, Identity, and Tribal Dynamics

  • Beliefs are tightly bound to identity, tribe, and self‑interest; attacking a belief can feel like attacking someone’s community or self.
  • Examples: anti‑vax narratives framed as “protecting your kids from evil outsiders”; climate and evolution framed as value conflicts, unlike relativity or chemistry.
  • Several argue both left and right use fear, disgust, and out‑group framing; others see contemporary right‑wing messaging as especially organized and authoritarian.
  • Trauma and insecurity make low‑information, high‑satisfaction conspiracies attractive (wildfires as “space lasers”, etc.).

Media, Algorithms, and Propaganda

  • Older corporate media selected “relevant” facts; social feeds now optimize for engagement, exposing people to highly curated, unrepresentative slices of reality.
  • Lying often happens via selective curation and framing rather than outright falsehoods (Chinese robber fallacy).
  • Discussion of state‑backed “troll” and “goblin” operations that game algorithms via engagement rather than direct messaging; disagreement over how impactful such efforts really are.

Science, Evidence, and Rationality

  • Long vaccine subthread: commenters broadly acknowledge that vaccine injury exists but argue over risk assessment, burden of proof, and when skepticism becomes irrational.
  • Some note humans are poor at statistical thinking and overweight rare harms vs. common disease risks.
  • Debate on how much scientific fraud or non‑replication (in some fields) should downgrade trust in entire evidence bases (a toy Bayesian calculation follows this list).
  • Extended correction of the standard Galileo vs. Church story: more nuanced, partly political, but still used as a powerful narrative trope.
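The "one fraudulent paper vs. a huge evidence base" intuition from the first section can be made concrete with a toy Bayesian calculation. The prior, per-study likelihood ratio, and study counts below are hypothetical placeholders, not figures from the thread; the point is only how the size of the evidence base changes the impact of one retraction.

```python
# Toy Bayesian sketch (hypothetical numbers throughout): how much does one
# fraudulent study shift confidence in a claim, depending on how many
# independent supporting studies remain?

def posterior(prior_odds: float, likelihood_ratio: float, n_studies: int) -> float:
    """Posterior probability of the claim after n independent supporting studies."""
    odds = prior_odds * likelihood_ratio ** n_studies
    return odds / (1.0 + odds)

PRIOR_ODDS = 1.0      # hypothetical 50/50 prior on the claim
LR_PER_STUDY = 2.0    # hypothetical evidential weight of one honest study

for n in (2, 20, 200):  # hypothetical sizes of the evidence base
    before = posterior(PRIOR_ODDS, LR_PER_STUDY, n)
    after = posterior(PRIOR_ODDS, LR_PER_STUDY, n - 1)  # one study retracted
    print(f"{n:>3} studies: {before:.4f} -> {after:.4f} after one retraction")
```

With two supporting studies, one retraction moves the posterior substantially; with hundreds, it barely registers, which is the shape of the argument on both sides of the fraud debate.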

Changing Minds and Persuasion

  • Facts alone rarely change minds; emotionally validating, structurally compatible arguments (e.g., Rogerian approaches) work better.
  • Anecdotes of deep belief change (e.g., leaving extremism) show it’s possible but extremely labor‑intensive and unscalable.
  • Fact‑checking can harden both sides by reinforcing in‑group trust and out‑group distrust rather than shifting interpretations.

Critiques of the Article and Model

  • Some find the node/edge distinction fuzzy, the climate‑change graph unconvincing, and the Russia‑centric section only weakly connected to the earlier theory.
  • Others say the piece re‑derives points long explored in philosophy of science (Peirce, Kuhn, Feyerabend) without engaging that literature.
  • Minor complaints about AI‑like style, heavy em dashes, and distracting interactive diagrams.

Institutions and Trust

  • Several emphasize that trust in institutions (statistics bureaus, regulators, geological surveys) supplies “structural” support for facts.
  • Open question: how to build high‑trust, apolitical information sources in an environment saturated with competing narratives and incentives.