EU proposal to scan all private messages gains momentum

Perceived Threat to Privacy and Democracy

  • Many see the proposal as “police state” overreach and a step toward authoritarianism, incompatible with liberal democracy and individualism.
  • There is strong resentment that opaque, weakly accountable EU institutions want citizens’ communications to be “transparent,” with fears that, once passed, the measure will be nearly impossible to reverse.
  • Some argue this will create a system where “everyone is guilty by default” and tools can later be repurposed against dissent, not just CSAM.

Effectiveness Against CSAM and Criminals

  • One side: scanning is effective because many offenders are careless; Meta’s huge volume of CSAM reports is cited as evidence that many offenders don’t bother with OPSEC.
  • Other side: serious abusers will adapt or already use secure channels; mass scanning will mostly burden innocents while not stopping determined offenders.
  • Debate over whether access to CSAM increases or reduces abuse remains unresolved; participants trade anecdotes and partial studies, but acknowledge causality is unclear.

Circumvention and Technical Feasibility

  • Numerous simple circumvention ideas are discussed: VPNs, alternative E2E apps, image “mangling” (XOR, re‑encoding images as other file types), split‑key schemes across platforms, and ephemeral RAM‑only chats (a sketch of the XOR/split idea follows this list).
  • Several note that enforcement ultimately implies locked‑down, attested devices and banned unofficial clients; others think lawmakers don’t grasp the technical implications.
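
To make the “mangling” and split‑key ideas concrete, here is a minimal sketch (not taken from the thread) of the XOR approach commenters describe: a file is split into two random‑looking shares that could be sent over different channels, neither of which matches any content hash on its own. The file name photo.jpg is a placeholder for illustration.

    import os


    def split_into_shares(path: str) -> tuple[bytes, bytes]:
        """Split a file into two XOR shares.

        Each share on its own is indistinguishable from random bytes, so
        neither would match a hash-based content scanner; XOR-ing the two
        shares together recovers the original file.
        """
        data = open(path, "rb").read()
        pad = os.urandom(len(data))                      # one-time pad, same length as the file
        share = bytes(a ^ b for a, b in zip(data, pad))  # ciphertext share
        return pad, share


    def recombine(pad: bytes, share: bytes) -> bytes:
        """Recover the original bytes by XOR-ing the two shares."""
        return bytes(a ^ b for a, b in zip(pad, share))


    if __name__ == "__main__":
        # "photo.jpg" is a placeholder input file for this illustration.
        pad, share = split_into_shares("photo.jpg")
        assert recombine(pad, share) == open("photo.jpg", "rb").read()

This is the same point several commenters make about feasibility: trivial transformations defeat hash matching, which is why enforcement pressure tends to move toward controlling the endpoints themselves.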

False Positives and Collateral Damage

  • A widely cited case where medical photos of a child triggered a CSAM flag and permanent account loss is used as a concrete warning.
  • Concerns focus on “guilty despite innocence”: automated flags leading to police investigations, account bans, or even children being removed from their families on the basis of misclassified family photos.

Platforms, Telegram, and Provider Responsibility

  • Some commenters who work in anti‑malware call Telegram de facto criminal infrastructure that ignores clear abuse reports and law‑enforcement requests, arguing that known criminal channels should be taken down.
  • Others respond that undermining encryption or blanket scanning is the wrong remedy; if a service won’t comply with lawful orders, target the company (up to bans), not everyone’s messages.

Public Opinion, Politics, and EU Institutions

  • Multiple commenters say most EU citizens are unaware; mainstream media and national parties rarely highlight the issue.
  • Others counter that voters broadly support child‑protection and age‑verification measures and trust governments more than “Big Tech,” so this is democracy, not just technocracy.
  • Some expect the Court of Justice of the EU (CJEU) to strike down large parts, but worry that each cycle of overreach gradually shifts power away from judicial checks.

Client-Side Scanning and Device Control

  • Client‑side scanning is viewed by many as worse than weakening encryption, effectively malware embedded in devices (a simplified sketch of the mechanism follows this list).
  • Fear that this will criminalize non‑official clients and drive a push for locked‑down hardware and OS-level surveillance.
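
For readers unfamiliar with the term, below is a highly simplified sketch of what client‑side scanning means: content is hashed and checked against a blocklist on the device itself, before any end‑to‑end encryption is applied. Real proposals envisage perceptual hashes (PhotoDNA‑style) and classifiers distributed in obfuscated form, not a plain SHA‑256 set as shown here; the blocklist entry is a placeholder.

    import hashlib
    from pathlib import Path

    # Hypothetical blocklist of known-content digests (placeholder value).
    # Real deployments distribute perceptual-hash databases, not plain SHA-256.
    BLOCKLIST = {
        "0" * 64,
    }


    def scan_before_send(path: Path) -> bool:
        """Return True if the file's digest appears on the blocklist.

        The check runs on the device before the message is encrypted, so
        end-to-end encryption of the transport does not hide the content
        from the scanner; this is why commenters compare it to built-in
        malware on the user's own hardware.
        """
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        return digest in BLOCKLIST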

Related Measures (Age Verification, Porn)

  • A late amendment mandating “robust” age verification for online porn, with prison terms for operators of non‑compliant sites, is highlighted as a major, under‑reported parallel development.