LFGSS shutting down 16th March 2025 (day before Online Safety Act duties are enforced)

Scope and intent of the UK Online Safety Act

  • The Act imposes a “duty of care” on user‑to‑user services to tackle illegal and some “legal but harmful” content, with fines of up to £18M or 10% of global turnover.
  • Ofcom guidance requires risk assessments across 17 “priority” harm areas, annual review, documented policies, reporting mechanisms, and moderation workflows.
  • Debate over thresholds: some argue small sites face only a subset of obligations; others note that many duties still apply below the “large service” thresholds and find the wording vague (e.g., “significant number of UK users”).

Burden on small / volunteer‑run communities

  • Many see the law as de facto hostile to small forums, which lack legal teams and full‑time compliance staff.
  • Forum operators describe this as turning a hobby into unpaid compliance work (policies, training, logs, CSAM scanning, appeals), with personal risk if they operate as individuals.
  • Some argue this repeats the GDPR/VATMOSS pattern: same rules for tiny operations and massive platforms, driving consolidation into Big Tech (“regulatory capture”).

Risk, enforcement, and “digital swatting”

  • One side: UK regulators (ICO/Ofcom) are typically proportionate; maximum fines are reserved for egregious large‑scale offenders; small operators mostly get guidance.
  • Other side: the mere possibility of life‑altering fines plus vague standards is enough to chill participation, regardless of likely enforcement.
  • Specific fear: disgruntled users or organized raids could upload illegal content (especially CSAM), then report the site, creating a “digital swatting” vector even if moderation is normally diligent.

Moderator safety and burnout

  • Multiple anecdotes of moderators and small‑forum admins facing death threats, doxxing, physical harassment, and DDoS attacks.
  • New legal exposure is seen as an additional, non‑technical attack surface layered on top of already hostile dynamics, pushing some to shut down rather than continue.

Mitigations and alternatives discussed

  • Suggestions: incorporate as a UK limited company or CIC for limited liability; share moderation; disable DMs; stricter onboarding; auto‑hiding reported posts; CSAM hash scanning (e.g., via Cloudflare).
  • Others note incorporation adds its own paperwork and costs, and does not remove all personal risk in extreme cases.
  • Some propose offshoring or anonymous hosting, but the Act’s extraterritorial reach (services “linked to the UK”) plus ethical/legal concerns leave the effectiveness of this unclear.
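
The hash‑scanning mitigation above can be sketched in miniature. This is illustrative only: real CSAM detection (e.g., Cloudflare’s CSAM Scanning Tool, or PhotoDNA‑based services) uses perceptual hashes matched against lists maintained by NCMEC/IWF, not plain cryptographic hashes, and those lists are never distributed to operators directly. The blocklist and function names below are hypothetical stand‑ins for that workflow.

```python
import hashlib

# Hypothetical blocklist of known-bad SHA-256 digests. In practice this
# matching is done server-side by a vendor/NGO service against perceptual
# hashes; a plain cryptographic hash is defeated by any 1-bit change.
# (The entry below is the SHA-256 of the empty byte string, for demo only.)
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_hex(data: bytes) -> str:
    """Cryptographic hash of the raw upload bytes."""
    return hashlib.sha256(data).hexdigest()

def should_quarantine(upload: bytes) -> bool:
    """True if the upload matches the blocklist: hold it for review
    instead of publishing, and log the event for the audit trail."""
    return sha256_hex(upload) in KNOWN_BAD_HASHES
```

A small forum would wire this into the upload handler so that matches are quarantined (never published) and surfaced to moderators, which is also the shape of the “auto‑hide reported posts” suggestion: fail closed, review later.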

Broader reflections

  • Many view this as part of a wider trend: governments, nudged by big‑tech lobbyists and “protect the children” framing, tightening control over online speech and unintentionally (or intentionally) killing independent communities.
  • Strong sense of loss over long‑running forums shutting down and the internet becoming more centralized, corporate, and bureaucratic.