Why the open social web matters now

Core problems of an open social web

  • Commenters repeatedly cite the same hard issues: moderation, spam (including scrapers), identity / “good faith” verification, and transparency around who is posting.
  • Some argue these are why open or federated systems struggle at scale more than centralized ones. Others counter that current federated systems (e.g., Mastodon instances) show these can be managed, at least at smaller scales.
  • There’s concern that decentralized protocols can expose more user data (likes, copies of posts) and make true deletion practically impossible.

Money, identity, and anti‑spam mechanisms

  • Many suggest small payments or subscriptions (even a one‑time $1–10 fee) as a powerful spam deterrent and moderation aid, though others warn this doesn’t reliably keep out bad actors and shifts power to payment processors.
  • Hashcash‑style proof‑of‑work, “burning” money to boost signal, and charity‑tied tokens are floated as alternatives (a minimal proof‑of‑work sketch follows this list); critics call these wasteful or demographically skewed (those willing to pay to post may not be who you want).
  • Digital IDs (e.g., mobile driver’s licenses, EU eIDAS wallets) and pairwise pseudonyms (also sketched after this list) are suggested to uniquely identify users while preserving some privacy, but people worry about centralized “cancellation” if ID providers or states control access.
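
To make the proof‑of‑work idea concrete, here is a minimal hashcash‑style sketch in Python; the 20‑bit difficulty, the hash choice, and the function names are illustrative assumptions, not any commenter’s specific proposal. The poster must find a nonce whose hash falls below a target, which costs real CPU time per message but takes a single hash for the receiver to verify.

```python
import hashlib
from itertools import count

def mint_stamp(message: str, difficulty_bits: int = 20) -> int:
    """Find a nonce so sha256(message + nonce) has `difficulty_bits`
    leading zero bits. Deliberately expensive for the sender."""
    target = 1 << (256 - difficulty_bits)   # hashes below this value qualify
    for nonce in count():
        digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify_stamp(message: str, nonce: int, difficulty_bits: int = 20) -> bool:
    """Cheap for the receiver: a single hash comparison."""
    digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = mint_stamp("hello fediverse")    # ~2^20 hash attempts on average
assert verify_stamp("hello fediverse", nonce)
```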
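
And a rough sketch of what “pairwise pseudonyms” can mean in practice, assuming a hypothetical ID provider with a master key (the key, field names, and truncation are illustration only): the provider derives a stable identifier per user per service, so each service can recognize a returning user without services being able to correlate that user across sites.

```python
import hashlib
import hmac

# Hypothetical provider-side secret; in reality this would be managed key material.
PROVIDER_SECRET = b"provider-master-key"

def pairwise_pseudonym(user_id: str, relying_party: str) -> str:
    """Same user + same service -> same pseudonym; different services
    get unlinkable identifiers, making cross-site correlation harder."""
    msg = f"{user_id}|{relying_party}".encode()
    return hmac.new(PROVIDER_SECRET, msg, hashlib.sha256).hexdigest()[:16]

print(pairwise_pseudonym("alice", "example.social"))   # stable for this service
print(pairwise_pseudonym("alice", "other.network"))    # different, unlinkable
```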

Moderation, free speech, and politics

  • One faction sees more assertive moderation and “good faith verification” (e.g., weeding out harmful misinformation, fake credentials, covert bots) as necessary, even if somewhat “authoritarian.”
  • Another faction insists platforms should only remove illegal content and otherwise let users fully control their own feeds; they see broader moderation as censorship that will be wielded politically (e.g., around immigration, “white supremacy,” Nazi labeling).
  • Several note that every community’s moderation is inherently biased; open systems might at least make those biases and blocklists transparent, allowing people to leave or choose different moderation services.

Feeds, discovery, and scale

  • Chronological, follow‑only feeds (RSS, Usenet‑style) are praised for minimizing spam and avoiding algorithmic rage‑bait, but criticized for poor discovery and susceptibility to high‑volume posters.
  • Algorithmic “discovery” is blamed for slop, addiction, and echo chambers, yet many argue most users, by revealed behavior, prefer it.
  • Some propose hybrid models: user‑curated follows, shared blocklists, and optional ranking based on likes from one’s social graph (a minimal sketch follows this list).
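
A minimal sketch of such a hybrid, assuming a follow‑only candidate set, a shared blocklist filter, and an optional boost from likes by accounts the reader already follows; the data shapes, names, and scoring weight are assumptions for illustration, not a particular proposal.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    timestamp: float                       # seconds since epoch
    liked_by: set[str] = field(default_factory=set)

def build_feed(posts, follows: set[str], blocklist: set[str],
               use_graph_ranking: bool = False) -> list[Post]:
    # Follow-only candidates, minus anything on the shared blocklist.
    candidates = [p for p in posts
                  if p.author in follows and p.author not in blocklist]
    if not use_graph_ranking:
        # Plain reverse-chronological feed.
        return sorted(candidates, key=lambda p: p.timestamp, reverse=True)
    # Optional ranking: recency plus likes from within the reader's graph.
    def score(p: Post) -> float:
        graph_likes = len(p.liked_by & follows)
        return p.timestamp + 3600 * graph_likes   # each graph-like counts as ~1 hour of recency
    return sorted(candidates, key=score, reverse=True)
```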

Decentralization trade‑offs and adoption

  • Several warn that decentralization is a “tar pit” of technical, privacy, and social complexity, and that many problems of centralized platforms (spam, harassment, regulation) carry over.
  • Others emphasize that centralization concentrates market and political power (platforms aligning with states), threatening smaller communities and independent speech.
  • A recurring criticism of open‑web efforts: “being open” isn’t enough; they must solve real user pain (where friends and creators are, easy UX, innovation in formats) or they will remain niche.