The EU moves to kill infinite scrolling
Cookie popups, GDPR, and “malicious compliance”
- Many argue cookie banners are a self‑inflicted UX disaster: GDPR only requires consent for non‑essential tracking, not for basic session/login cookies, yet risk‑averse legal teams demand banners “just in case.”
- Commenters cite examples: government sites and companies that don’t track still show banners, and site‑hosting platforms force banners onto every customer site because lawyers can’t guarantee what those customers embed.
- Others stress the real problem is dark‑pattern consent flows and poor enforcement: popups that default to tracking or bury opt‑outs clearly violate GDPR’s intent.
- Some call the cookie/cookie‑law approach “fundamentally stupid,” saying browsers should handle tracking control (e.g., honoring Do Not Track or a browser‑level opt‑out) instead of pushing UX onto every site.
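The browser‑level opt‑out mentioned above already exists in two forms: the legacy `DNT` header and the newer `Sec-GPC` header from the Global Privacy Control proposal. A minimal sketch of how a server could honor either signal before setting non‑essential tracking cookies — the header names are real, but the helper function and its logic are purely illustrative:

```javascript
// Sketch: honor browser-level opt-out signals before enabling tracking.
// The DNT and Sec-GPC header names are real; this helper is illustrative.
function trackingAllowed(headers) {
  // Normalize header names to lowercase, as Node's http module does.
  const h = Object.fromEntries(
    Object.entries(headers).map(([k, v]) => [k.toLowerCase(), v])
  );
  // In both specs, a value of "1" means the user opted out of tracking.
  if (h["dnt"] === "1") return false;
  if (h["sec-gpc"] === "1") return false;
  return true; // No opt-out signal present.
}

// Example: a request carrying Global Privacy Control.
console.log(trackingAllowed({ "Sec-GPC": "1" })); // → false
console.log(trackingAllowed({}));                 // → true
```

The point made in the thread is that a check like this could replace per‑site consent UX entirely, moving the choice into the browser once rather than into a popup on every site.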
Regulating infinite scroll and addictive design
- Supporters see infinite scroll + autoplay + engagement‑optimized feeds as deliberately addictive, comparable (in kind if not degree) to sugar, gambling, or tobacco; they welcome regulation to protect children and “the weak,” not just self‑controlled power‑users.
- Critics frame this as paternalism and an attack on personal responsibility: “just don’t install the app,” “turn off your phone,” and worry about a slippery slope (games, Netflix, even chess next?).
- Many say infinite scroll alone is a distraction: the real harm is algorithmic, personalized feeds optimized for watch‑time and radicalization, not whether content is paginated.
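The distinction the last point draws — pagination versus an auto‑loading feed — is mechanically small. Both fetch the next batch of items; what differs is whether the user explicitly asks for it or a scroll sensor (e.g. an `IntersectionObserver` watching a sentinel element) asks on their behalf. A toy sketch of the shared loader, with all names illustrative:

```javascript
// Toy feed loader: the same nextBatch() backs both UIs.
// Pagination calls it on an explicit "next page" click; infinite scroll
// calls it automatically whenever a scroll sentinel becomes visible.
function makeFeed(items, pageSize) {
  let cursor = 0;
  return {
    nextBatch() {
      const batch = items.slice(cursor, cursor + pageSize);
      cursor += batch.length;
      return batch;
    },
    done() {
      return cursor >= items.length;
    },
  };
}

const feed = makeFeed(["a", "b", "c", "d", "e"], 2);
console.log(feed.nextBatch()); // → ["a", "b"]  (explicit click)
// In an infinite-scroll UI, an IntersectionObserver would fire the same
// call without any user decision, e.g.:
//   new IntersectionObserver(() => feed.nextBatch()).observe(sentinel);
console.log(feed.nextBatch()); // → ["c", "d"]
```

This is why several commenters argue a ban on the "infinite scroll element" is easy to game: the regulatory question is really about the trigger and the absence of a stopping point, not the fetch itself.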
Addiction, free will, and societal costs
- A long sub‑discussion compares social‑media use to addictions (smoking, heroin, sugar, casinos). One side emphasizes how hard “just stop” is; the other insists laws shouldn’t be built around “trivial mental illnesses.”
- Some argue governments already regulate addictive products (tobacco, opioids), and engineered digital addiction should be treated similarly.
- Others counter that equating doomscrolling with lethal drugs is dangerous overreach and risks broad censorship/behavior control.
Advertising and business models as root cause
- A large contingent claims online advertising—especially behavioral targeting—is the real driver of dark patterns and addiction (“more time in app = more ad revenue”).
- Proposals include:
- banning or heavily taxing internet ads;
- banning paid promotion (compensated advertising) rather than speech itself;
- banning personalized/behavioral targeting while allowing contextual ads;
- taxing ad‑driven engagement time directly.
- Objections: who funds “free” content and small businesses? Would the result just be fragmented subscription silos and stronger incumbents? How do you even define “advertising” or “targeting” without huge loopholes?
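One proposal above — allowing contextual ads while banning personalized/behavioral targeting — rests on a concrete technical distinction: contextual selection uses only the page currently being viewed, while behavioral selection uses a profile accumulated across the user’s history, which is what requires tracking. A toy sketch of the difference (the ad inventory, topic matching, and function names are entirely illustrative):

```javascript
// Toy inventory: each ad lists the topics it is relevant to.
const ads = [
  { id: "bike-ad",  topics: ["cycling", "outdoors"] },
  { id: "stock-ad", topics: ["finance", "investing"] },
];

// Contextual: match ads against the current page's topics only.
function contextualPick(pageTopics) {
  return ads.filter(ad => ad.topics.some(t => pageTopics.includes(t)));
}

// Behavioral: match against a profile built from the user's browsing
// history — assembling this input is the part that needs cross-site tracking.
function behavioralPick(userProfileTopics) {
  return ads.filter(ad => ad.topics.some(t => userProfileTopics.includes(t)));
}

// A cycling article shows a bike ad regardless of who is reading it:
console.log(contextualPick(["cycling"]).map(a => a.id)); // → ["bike-ad"]
// The same matching logic over a tracked profile follows the user everywhere:
console.log(behavioralPick(["investing", "cycling"]).map(a => a.id));
```

Note the matching code is identical in both branches; the loophole objections in the thread follow directly from this — any rule has to regulate the *input* (where the topics came from), not the selection algorithm.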
EU regulatory style and enforcement worries
- Some praise the EU’s “intent‑based” approach: broad rules against “addictive design,” enforced case‑by‑case (via DSA/GDPR‑style mechanisms), instead of brittle, easily gamed technical bans (e.g., “no infinite scroll element”).
- Others see vague “vibes‑based” rules as dangerous: they create legal uncertainty, allow selective enforcement against disfavored firms, and resemble tools for political leverage more than citizen protection.
- There’s debate over whether the EU is genuinely protecting users or mainly creating powerful levers over large (often foreign) platforms, with ordinary individuals having limited ability to invoke these laws themselves.