Using Cloudflare on your website could be blocking RSS users

Impact on RSS and feed readers

  • Many RSS readers receive 403s or Cloudflare challenges on feeds, often intermittently and without clear reason.
  • This affects both small blogs and large sites (including Cloudflare‑hosted blogs and government sites).
  • Feed reader operators report high support load and some users abandoning feeds or entire sites when they break.
  • Even “verified” or well‑behaved readers can be blocked; allowlisting individual IPs or user agents scales poorly.

Cloudflare behavior and configuration

  • Blocking often comes from features like Bot Fight Mode, Browser Integrity Check, generic bot protection, and WAF rules.
  • Cloudflare sometimes injects scripts into RSS feeds (email obfuscation, challenge pages), which breaks machine parsing of the XML.
  • Cloudflare appears to rely heavily on Content-Type to detect RSS; many feeds are served with generic XML types (text/xml, application/xml) rather than application/rss+xml or application/atom+xml, so the checks fail.
  • High security levels or aggressive bot settings can override “good bot” status and still produce 403s.
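To make the Content-Type point concrete: application/rss+xml and application/atom+xml unambiguously identify feeds, while the generic text/xml and application/xml do not. A minimal sketch of a Content-Type-only classifier and its failure mode (the heuristic is an assumption for illustration; Cloudflare's actual detection logic is not public):

```python
# Feed-specific MIME types. application/atom+xml is registered in RFC 4287;
# application/rss+xml is the conventional (unregistered) RSS type;
# application/feed+json is the JSON Feed type.
FEED_TYPES = {"application/rss+xml", "application/atom+xml", "application/feed+json"}

# Generic XML types many servers emit for feeds by default. A classifier
# keyed on Content-Type alone cannot tell these apart from arbitrary XML.
GENERIC_XML = {"text/xml", "application/xml"}

def looks_like_feed(content_type: str) -> bool:
    """Hypothetical Content-Type-only check, mirroring the failure mode
    described above: feeds served with generic XML types are missed."""
    mime = content_type.split(";")[0].strip().lower()
    return mime in FEED_TYPES

print(looks_like_feed("application/rss+xml; charset=utf-8"))  # True
print(looks_like_feed("text/xml"))  # False: a real feed with a generic type is missed
```

Serving feeds with the specific types costs nothing and helps any intermediary that keys on Content-Type, not just Cloudflare.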

Mitigation strategies suggested

  • Allowlist RSS endpoints by URL/path or subdomain rather than by user agent, and disable bot protection only there.
  • Use page/configuration rules on feed paths to disable bot checks, relax Browser Integrity Check, and enable strong caching.
  • Split traffic across subdomains (e.g., www for humans, feeds for bots, audio for mixed traffic) so security can be tuned per traffic type.
  • Use caching plus rate limiting to handle abusive scrapers instead of blanket blocking.
  • Some recommend third‑party proxies (e.g., feed aggregation services, scraping APIs) to bypass Cloudflare.
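The "caching plus rate limiting" approach above can be sketched as a small gate in front of a feed renderer. All names, TTLs, and thresholds here are illustrative assumptions, not settings from the discussion:

```python
import time
from collections import defaultdict

class FeedGate:
    """Sketch of cache-plus-rate-limit for a feed endpoint: serve a cached
    copy to everyone, and throttle only clients that refetch far faster
    than the cache TTL makes useful, instead of blanket-blocking bots."""

    def __init__(self, render, ttl=300, max_hits=10, window=60):
        self.render = render        # function producing the feed body
        self.ttl = ttl              # cache lifetime in seconds
        self.max_hits = max_hits    # allowed fetches per client per window
        self.window = window        # rate-limit window in seconds
        self._cache = (0.0, None)   # (rendered_at, body)
        self._hits = defaultdict(list)

    def get(self, client_ip, now=None):
        now = time.time() if now is None else now
        # Rate limit: keep only fetch timestamps inside the window.
        hits = [t for t in self._hits[client_ip] if now - t < self.window]
        if len(hits) >= self.max_hits:
            self._hits[client_ip] = hits
            return 429, "rate limited"
        hits.append(now)
        self._hits[client_ip] = hits
        # Cache: re-render only when stale, so even heavy polling
        # rarely reaches the origin renderer.
        rendered_at, body = self._cache
        if body is None or now - rendered_at > self.ttl:
            body = self.render()
            self._cache = (now, body)
        return 200, body
```

With a 300-second TTL, a reader polling every minute is served from cache; only clients exceeding max_hits per window see a 429, which well-behaved readers can back off from, unlike an opaque 403.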

Responsibility and defaults debate

  • One view: site admins misconfigure Cloudflare; it’s doing what they asked (block non‑human traffic).
  • Opposing view: Cloudflare’s defaults and UX “make it easy to break the web,” so Cloudflare shares responsibility.
  • There is disagreement over how important RSS is today, but several argue it underpins podcasts and independent publishing.

Privacy, anti‑bot arms race, and collateral damage

  • AI crawlers, spam, and abusive bots are cited as major drivers of stricter defenses.
  • Collateral damage includes RSS, Tor, VPN users, Firefox with privacy features, custom user agents, and niche browsers.
  • Users report endless captchas, opaque 403s, referer‑based blocking, and quietly giving up on affected sites.

Proposed product improvements

  • Auto‑detect RSS/Atom/sitemaps and relax bot checks by default.
  • Provide simple UI switches for “allow automated access to feeds/sitemaps.”
  • Add better block‑page feedback (“misclassified” or contact links) and analytics so site owners see false positives.
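The first proposal above, auto-detecting feeds and sitemaps, could start from path heuristics like these. The patterns are assumptions for illustration and are narrower than a real implementation would need (which would also inspect Content-Type and response bodies):

```python
import re

# Hypothetical path heuristics for "auto-detect feeds and sitemaps":
# common feed endpoints (/feed, /rss.xml, /atom.xml, ...) and sitemap
# files. These patterns are assumptions, not Cloudflare's actual logic.
FEED_PATH = re.compile(
    r"(^|/)(feed|rss|atom)(\.(xml|rss|atom|json))?/?$"
    r"|(^|/)sitemap[^/]*\.xml$",
    re.IGNORECASE,
)

def is_probably_feed(path: str) -> bool:
    """Return True if the URL path looks like a feed or sitemap endpoint."""
    return bool(FEED_PATH.search(path))

for p in ["/feed", "/blog/rss.xml", "/sitemap-posts.xml", "/about"]:
    print(p, is_probably_feed(p))  # True, True, True, False
```

Matching paths could then get relaxed bot checks by default, with the simple UI switch proposed above acting as an override.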