How much do I need to change my face to avoid facial recognition?

Technical effectiveness and evasion methods

  • Former FR engineer: most real-time systems use a first-pass “generic face” detector; if you fail that (e.g., extra eyes, distorted features), you’re effectively invisible to the system but very conspicuous to humans.
  • Simple occlusion (masks, sunglasses, hats) remains highly effective, especially for public CCTV without depth sensors.
  • Extreme makeup (e.g., “juggalo,” CV Dazzle) historically worked by breaking facial landmarks, but commenters suspect modern models are now trained against such patterns.
  • Others suggest prosthetics, tattoos, eye-shaped stickers, IR LEDs, or religious face coverings; many note these either draw human suspicion or likely trigger security intervention.
  • Some mention gait recognition as an emerging or existing complement to facial recognition: harder to defeat with accessories or occlusion, but possible to evade by deliberately changing how you walk.

Real-world deployments and normalization

  • Airports and borders: multiple stories of automated gates and live face matching replacing manual checks, including systems that track passengers throughout terminals and flag “lingering.”
  • Some users describe being shown compiled movement footage after an incident, suggesting real-time tracking and easy retrospective retrieval.
  • Workplace and retail surveillance: systems log employee entry/exit, plate recognition, clothing color queries, and behavior analytics; video already used to resolve disputes and detect internal theft.

Limits, error rates, and scale problems

  • Several point out that facial recognition is highly effective in constrained contexts (airport gate, known time/location) but struggles at national scale due to false positives.
  • Even low error rates (0.1–1%) become operationally overwhelming when millions pass through major hubs daily.
  • Claims of very high accuracy coexist with reports of practical false positives and wrongful matches; some note courts and authorities often over-trust biometric “matches.”
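The base-rate arithmetic behind that scaling problem can be sketched in a few lines. The throughput, false-positive rate, hit rate, and watchlist prevalence below are illustrative assumptions, not figures from any real deployment:

```python
# Back-of-envelope: alarm volume for watchlist matching at hub scale.
# All parameters are hypothetical, chosen only to show how a tiny
# false-positive rate swamps a rare-target search.

def expected_alarms(daily_passengers: int, fpr: float,
                    hit_rate: float, prevalence: float) -> tuple[float, float]:
    """Return (false_alarms, true_alarms) expected per day."""
    targets = daily_passengers * prevalence          # people actually on the list
    non_targets = daily_passengers - targets         # everyone else
    false_alarms = non_targets * fpr                 # innocents flagged
    true_alarms = targets * hit_rate                 # listed people caught
    return false_alarms, true_alarms

# 1M passengers/day, 0.1% FPR, 99% hit rate, 1-in-100,000 prevalence
fa, ta = expected_alarms(1_000_000, fpr=0.001, hit_rate=0.99, prevalence=1e-5)
print(f"false alarms/day: {fa:.0f}")               # ~1000
print(f"true alarms/day:  {ta:.1f}")               # ~10
print(f"precision of an alarm: {ta / (ta + fa):.1%}")  # ~1%
```

Under these assumed numbers, roughly 99% of alarms are false even though the matcher is "99.9% accurate" per comparison, which is the operational overload commenters describe.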

Privacy, law, and societal impact

  • Strong concern about mass, asymmetrical surveillance: “they” see everything; the public sees nothing.
  • Debate over whether people “have no expectation of privacy in public,” with counterarguments citing European laws and cultural norms that regulate even public-space cameras.
  • Some welcome pervasive surveillance for exculpatory evidence and crime reduction; critics respond that access is asymmetric, often unavailable to defendants, and historically used against marginalized groups.
  • The thread highlights normalization via opt-out-by-friction (e.g., border photos), surveillance creep from airports into everyday spaces, and fears that such systems enable authoritarian control, discrimination, or abuse under future regimes.