I use zip bombs to protect my server

Ethics and Intent

  • Some see serving zip/gzip bombs to abusive bots as justified self‑defense, with “no ethical ambiguity” in sending garbage to clearly malicious traffic, especially given the volume of spam, exploit probes, and AI scrapers.
  • Others argue “two wrongs don’t make a right,” likening it to vigilante justice or booby traps (mantraps), which are often illegal even against trespassers.
  • Middle ground: it’s ethically fine if it only hits clearly malicious traffic, but blocklists are imperfect and VPN exit IPs are shared, so innocent users can end up behind “bad” addresses and suffer collateral damage.

Legal Ambiguity

  • Several comments raise the US Computer Fraud and Abuse Act (CFAA): intentionally transmitting code that damages another computer could be illegal, even when that machine initiated the connection.
  • Counter‑arguments: clients explicitly request the content; a zip bomb is non‑destructive (a reboot fixes it); and robots.txt could serve as a posted “keep out” sign.
  • Consensus: there are no known cases of a crawler operator suing over a zip bomb; the risk is theoretical but non‑zero, and misclassifying legitimate bots or users is the key concern.

Effectiveness vs Alternatives

  • Some report zip bombs failing against robust scrapers (e.g., Amazon’s), which simply retry or cap download size.
  • Others worry about provoking DDoS retaliation or wasting their own CPU/bandwidth if traps are too heavy (e.g., random data streams, content labyrinths).
  • Alternative countermeasures discussed:
    • WAFs with custom rules (complaints focus on poor defaults, especially on cloud platforms).
    • IP/ASN/regional blocking, fail2ban/CrowdSec, and Cloudflare‑style CAPTCHAs/Turnstile.
    • TCP tarpits and tools like endlessh for SSH; slow chunked HTTP responses (sketched after this list).
    • Honeypot form fields and hidden links that only bots see.
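
One of these traps, the slow chunked HTTP response, is cheap to sketch. The following minimal Python tarpit drip-feeds tiny chunks to hold a bot’s connection open; the /trap path and 5‑second interval are illustrative assumptions, not the article’s setup:

    # Minimal slow-drip HTTP tarpit: hold a suspected bot's connection
    # open by sending one tiny chunk every few seconds.
    import http.server
    import time

    class TarpitHandler(http.server.BaseHTTPRequestHandler):
        protocol_version = "HTTP/1.1"  # chunked encoding requires HTTP/1.1

        def do_GET(self):
            if self.path != "/trap":  # hypothetical trap path
                self.send_response(404)
                self.send_header("Content-Length", "0")
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Transfer-Encoding", "chunked")
            self.end_headers()
            try:
                while True:  # drip chunks until the client gives up
                    chunk = b"<!-- -->"
                    self.wfile.write(b"%x\r\n%s\r\n" % (len(chunk), chunk))
                    self.wfile.flush()
                    time.sleep(5)
            except (BrokenPipeError, ConnectionResetError):
                pass  # client disconnected, which is the goal

    if __name__ == "__main__":
        http.server.ThreadingHTTPServer(("", 8080), TarpitHandler).serve_forever()

ThreadingHTTPServer matters here: each tarpitted client ties up only one thread, whereas a single-threaded server would tarpit itself.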

Zip Bomb Mechanics and Defenses

  • This setup is mainly a gzip bomb delivered via the HTTP Content-Encoding header, not a nested ZIP file on disk.
  • Technical notes:
    • Enormous compression ratios come from compressing runs of zeros with gzip, Brotli, or zstd; nesting archives amplifies this further, but many clients only auto‑decompress one layer (see the first sketch after this list).
    • Modern browsers generally just crash the tab/process when memory is exhausted.
  • Defenses suggested:
    • Enforce limits on decompressed bytes, file counts, CPU time, and memory: stream the decompression and cut off after N bytes (see the second sketch after this list).
    • Use cgroups, small temp partitions, or containers to isolate decompression.
    • Some AV/EDR products already skip scanning after size/ratio thresholds; this itself can be exploited.
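
To make the ratio concrete, here is a hedged sketch of building such a payload in Python. The 10 GiB target is illustrative; gzip on runs of zeros achieves roughly 1000:1, so the on-wire body lands near 10 MiB:

    # Build a gzip "bomb" body from zeros. Sizes are illustrative:
    # 10 GiB of zeros compresses to roughly 10 MiB at level 9.
    import gzip
    import io

    def make_gzip_bomb(decompressed_size: int = 10 * 1024**3,
                       chunk: int = 1024**2) -> bytes:
        buf = io.BytesIO()
        with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=9) as gz:
            zeros = b"\0" * chunk
            for _ in range(decompressed_size // chunk):
                gz.write(zeros)
        return buf.getvalue()

    if __name__ == "__main__":
        body = make_gzip_bomb()
        # Serve `body` with "Content-Encoding: gzip"; a client that
        # naively auto-decompresses tries to materialize all 10 GiB.
        print(f"compressed payload: {len(body) / 1024**2:.1f} MiB")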
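
On the defense side, a sketch of the byte-limited streaming decompression, assuming an illustrative 100 MiB cap:

    # Decompress a gzip body incrementally and abort once output
    # exceeds a hard cap, instead of trusting Content-Length.
    import zlib

    def safe_gunzip(compressed: bytes, limit: int = 100 * 1024**2) -> bytes:
        d = zlib.decompressobj(wbits=16 + zlib.MAX_WBITS)  # gzip framing
        out = bytearray()
        for i in range(0, len(compressed), 64 * 1024):
            # max_length caps how much this call may inflate, so a
            # bomb cannot blow past the limit within a single call.
            out += d.decompress(compressed[i:i + 64 * 1024],
                                limit - len(out) + 1)
            if len(out) > limit:
                raise ValueError("decompressed size over limit: likely a bomb")
        out += d.flush()
        if len(out) > limit:
            raise ValueError("decompressed size over limit: likely a bomb")
        return bytes(out)

Pairing this with a cgroup or container memory limit also covers the case where the decompression library itself buffers more than expected.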

Operational and SEO Risks

  • Zip bombs can trip malware filters and Google Safe Browsing: one commenter reported that a hosted zip bomb got an entire domain flagged, breaking unrelated Tor infrastructure.
  • Robots.txt is proposed to keep “good” crawlers (Google/Bing) away from trap URLs, though many bad bots ignore it or treat disallowed paths as a target list; a minimal example follows.
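
A minimal version of that proposal, with /trap/ as an assumed trap path:

    # robots.txt: ask well-behaved crawlers to stay out of the trap.
    # Malicious bots routinely ignore this file, or crawl Disallow
    # paths on purpose, which is what the trap counts on.
    User-agent: *
    Disallow: /trap/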