AI uBlock Blacklist
Scope and Purpose of the List
- The new list specifically targets “AI slop” content farms, rather than all AI‑related sites.
- Unlike broad anti‑AI lists tuned to hide search results (e.g., via uBlacklist on Google/DDG/Bing), this one blocks at the network level, so any visit triggers a visible warning while preserving a user override.
- Several commenters prefer this approach for its clearer user agency and more “grounded” focus on deceptive, low‑quality SEO farms.
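The distinction can be sketched with two illustrative rules (the domain below is a placeholder, not an entry from the actual list). A uBlacklist match pattern only hides search results, while a uBlock Origin network filter blocks the site itself; with a whole‑site filter, uBlock shows a warning page that the user can still click through:

```adblock
! uBlacklist match pattern: hides the site from supported search engines
*://*.example-slop-farm.invalid/*

! uBlock Origin network filter: blocks any visit to the site outright,
! presenting a warning page with a "Proceed" override
||example-slop-farm.invalid^
```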
Broad Anti‑AI Lists vs Focused Blocking
- Critics of broader lists say they behave like a “hater list,” catching anything AI‑adjacent (e.g., tools like ChatGPT) rather than just AI‑generated content that pollutes search.
- Defenders respond that their goal is precisely to remove AI‑generated material from search because they never want that as a search result; if they wanted AI output, they’d go directly to an LLM.
- Some users share their own SEO‑farm hunting tactics (e.g., shared footer patterns, spreadsheets of networks) and feed those into custom blocklists.
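The shared‑footer tactic can be sketched in a few lines: normalize each page's footer text, hash it, and group domains that share a fingerprint. This is a minimal illustration with made‑up sample pages, not a reconstruction of any commenter's actual tooling.

```python
import hashlib
import re

def footer_fingerprint(html: str) -> str:
    """Hash the normalized text of a page's <footer>, if any."""
    m = re.search(r"<footer[^>]*>(.*?)</footer>", html, re.S | re.I)
    if not m:
        return ""
    text = re.sub(r"<[^>]+>", " ", m.group(1))        # strip inner tags
    text = re.sub(r"\s+", " ", text).strip().lower()  # normalize whitespace/case
    return hashlib.sha256(text.encode()).hexdigest()[:12]

def group_by_footer(pages: dict[str, str]) -> dict[str, list[str]]:
    """Map footer fingerprint -> list of domains sharing that footer."""
    groups: dict[str, list[str]] = {}
    for domain, html in pages.items():
        fp = footer_fingerprint(html)
        if fp:
            groups.setdefault(fp, []).append(domain)
    return groups

# Hypothetical sample pages: two share a boilerplate network footer.
pages = {
    "a.example": "<body><footer>© 2024 ContentCo Network</footer></body>",
    "b.example": "<body><footer>©  2024  contentco network </footer></body>",
    "c.example": "<body><footer>A real personal blog</footer></body>",
}
for fp, domains in group_by_footer(pages).items():
    if len(domains) > 1:
        print(fp, sorted(domains))  # domains likely run by one operator
```

Domains that cluster under one fingerprint are candidates for a custom blocklist entry, pending manual review.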
Assistive Tools vs AI Slop
- A side discussion distinguishes between:
  - Assistive tools (Grammarly, translation, LLM help for non‑native speakers or people with writing difficulties).
  - Mass‑produced AI content farms and “AI‑polished” slop.
- Multiple anecdotes describe coworkers using Copilot/LLMs to send verbose, generic emails that dilute clarity and damage their professional reputation.
Governance, False Positives, and Maintainer Attitude
- A major concern is the repo FAQ’s “Cry about it” answer to “My site is on your list,” plus stated reluctance to delist domains even after ownership changes.
- Critics:
  - Call this unprofessional for a public list and warn of a “one‑way reputational blackhole,” especially as domains are sold or sites pivot.
  - Note real experiences where poorly maintained blocklists silently broke personal sites with no recourse.
  - Point to entries like AP News as examples of questionable inclusions, sometimes inherited from SEO‑company documents.
- Defenders:
  - Emphasize that it is explicitly a personal list; users are free to fork it or edit it locally.
  - Argue that engaging with every SEO operator claiming innocence is an unsustainable “mental denial‑of‑service”; “ban first, users adjust if needed.”
- Some suggest longer‑term sustainability would require integration into established projects (e.g., Easylist‑style) with mature maintenance processes; others say short‑lived utility is fine.
Effectiveness and Arms Race
- Several commenters see this as useful but ultimately limited: AI/human content is merging quickly, making domain‑level blocking a blunt, non‑scalable instrument.
- Others note evolutionary pressure: detection‑based blocking incentivizes AI systems and slop‑farm operators to become harder to distinguish from humans.
- One view: once a “legitimate” site is heavily spammed with AI slop, it effectively ceases to be legitimate for practical browsing purposes.
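Why domain‑level blocking is blunt can be shown directly: a `||domain^`‑style filter matches the whole domain and every subdomain, so one spammed section takes down everything under that name. A minimal sketch of that suffix‑matching behavior (the domains are hypothetical):

```python
def is_blocked(hostname: str, blocklist: set[str]) -> bool:
    """True if hostname or any parent domain is on the blocklist,
    mirroring how a ||domain^ network filter also matches subdomains."""
    labels = hostname.lower().rstrip(".").split(".")
    return any(".".join(labels[i:]) in blocklist for i in range(len(labels)))

blocklist = {"slop-farm.invalid"}
assert is_blocked("slop-farm.invalid", blocklist)
assert is_blocked("cdn.slop-farm.invalid", blocklist)   # subdomains blocked too
assert not is_blocked("legit-site.invalid", blocklist)  # unrelated names pass
```

There is no way, at this granularity, to keep the human‑written half of a site while dropping its AI‑flooded half.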
Alternatives: Whitelists and Quality Signals
- Proposals include:
  - “Greenlists” / whitelists of high‑quality, mostly human‑authored sites, possibly with visible tags in search results.
  - Periodic re‑review of whitelisted and blacklisted sites to reflect content changes.
  - Broader, possibly publicly funded content‑reputation infrastructure (akin to enterprise URL‑reputation services) to improve everyday browsing without per‑user Pi‑hole setups.
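The re‑review proposal amounts to attaching a last‑reviewed date to each entry and flagging anything past a freshness window. A minimal sketch, with hypothetical domains and dates:

```python
from datetime import date, timedelta

# Hypothetical allowlist: domain -> date of last human review.
allowlist = {
    "quality-blog.invalid": date(2025, 11, 1),
    "old-favorite.invalid": date(2023, 2, 14),
}

def stale_entries(entries: dict[str, date], today: date,
                  max_age: timedelta = timedelta(days=365)) -> list[str]:
    """Return domains whose last review is older than max_age,
    i.e., due for a fresh look because the site may have changed."""
    return sorted(d for d, reviewed in entries.items()
                  if today - reviewed > max_age)

print(stale_entries(allowlist, date(2025, 12, 1)))  # ['old-favorite.invalid']
```

The same mechanism works for blacklists, addressing the "domains get sold or sites pivot" concern raised above.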
Related Tools and Ecosystem
- Commenters mention other tools:
  - The existing uBlock Origin “AI widget” list.
  - botblock.ai for detecting AI replies on Twitter/X (with skepticism about its accuracy).
  - tropes.fyi for flagging likely AI‑generated text and inferring its prompts.
- Some report immediate subjective benefits (“Firefox already feeling more responsive”) when enabling such blocklists.
Terminology and Broader Adblocking Context
- A side debate covers terms like “blacklist” vs. “blocklist” and “master” vs. “main”:
  - One side sees renaming as empty virtue signaling.
  - The other argues that language shapes bias; if neutral alternatives exist, using them is low‑cost and more inclusive.
- Meta discussion on adblockers: some users rely less on them thanks to curated browsing habits; others cite millions of blocked requests and DNS‑level blocking, and argue that powerful, user‑controlled blocking should be a native browser feature.