Discord's face scanning age checks 'start of a bigger shift'

Accuracy and Bias of Face-Based Age Checks

  • Many doubt AI can reliably distinguish, say, someone aged 17 years 11 months from an 18‑year‑old, or handle young‑looking adults, early or delayed puberty, and medically or hormonally atypical users.
  • Examples include adults routinely mistaken for teens and teens who look much older, plus reported failures of existing KYC face systems.
  • Concerns about racial bias and lighting (e.g. Black users needing many more attempts) and about trans users or people on puberty blockers being misclassified.
  • Commenters note that these edge cases aren’t rare, so misclassifications (e.g. adults wrongly flagged as underage) could be common and hard to appeal.

Privacy, Biometrics, and Surveillance Fears

  • Strong worry that “we don’t store your face” is a semantic dodge: embeddings or “biometric metadata” can still uniquely identify people and be leaked, sold, or subpoenaed (see the sketch after this list).
  • Many see this as normalization of biometric collection and the erosion of online anonymity, with parallels drawn to RealID, SIM registration, and face‑scanning borders.
  • Some fear a trajectory toward mandatory real‑ID for all internet use, centralized databases exploitable for blackmail or targeting minorities, and a broader “compliance‑industrial complex.”
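
A minimal sketch of the embedding concern, assuming a generic face‑embedding model (the model, vector size, and similarity threshold below are illustrative, not Discord's actual system): even if no image is retained, whoever holds the stored vector can match it against any future photo of the same person.

```python
# Why "we only keep an embedding, not your face" still enables re-identification.
# Assumes some face-embedding model that maps a photo to a fixed-length vector;
# photos of the same person land close together in that vector space.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(stored: np.ndarray, new: np.ndarray, threshold: float = 0.6) -> bool:
    """A leaked, sold, or subpoenaed embedding can be matched against a new
    photo of the same person, so the vector acts as a biometric identifier
    even though no image was kept. The threshold is illustrative."""
    return cosine_similarity(stored, new) >= threshold

# Stand-in vectors for illustration (a real system would get these from the model):
rng = np.random.default_rng(0)
stored = rng.normal(size=512)                           # vector kept by the verifier
later_photo = stored + rng.normal(scale=0.1, size=512)  # same face, new photo
print(is_same_person(stored, later_photo))              # True: re-identified
```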

Parents vs State: Who Should Protect Kids?

  • One camp holds that this is a parental responsibility, and that such laws are overreach: a power grab disguised as child protection.
  • Others argue parenting alone is unrealistic: social ostracism if kids are kept off platforms, widespread harms from algorithmic feeds, and analogies to age limits on alcohol, cigarettes, porn, and driving.
  • There’s tension between wanting to shield kids from porn and social‑media harms and not accepting mass ID checks or biometrics as the price.

Effectiveness and Workarounds

  • Many predict trivial circumvention: VPNs, older friends’ faces, stolen or shared IDs, AI‑generated faces, or moving to less‑regulated chat systems.
  • That leads some to label this “security theater” that mainly increases data collection while pushing kids to darker, less moderated corners of the net.

Legal and Structural Concerns

  • Commenters mention UK, Australian, and US state laws and court cases as part of a broader, messy legal push on age verification, one that may be designed to be impossible to fully comply with.
  • Some small service operators plan to geoblock the UK rather than implement complex checks, expecting this might eventually force policy rollback.
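
A minimal sketch of that geoblocking route (the country lookup is a hypothetical stand‑in; a real service would use a GeoIP database or a CDN‑supplied country header, and VPNs bypass it trivially, which is part of why commenters call the whole scheme porous):

```python
# Refuse UK traffic outright rather than build an age-verification flow.
from http import HTTPStatus

BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 code for the United Kingdom

def lookup_country(ip: str) -> str:
    """Hypothetical stand-in; swap in a GeoIP lookup or CDN country header."""
    return {"203.0.113.5": "GB"}.get(ip, "US")

def handle_request(ip: str) -> tuple[int, str]:
    """Return an HTTP status and body for a request from the given client IP."""
    if lookup_country(ip) in BLOCKED_COUNTRIES:
        # 451 Unavailable For Legal Reasons: a deliberate refusal to serve.
        return HTTPStatus.UNAVAILABLE_FOR_LEGAL_REASONS, "Not available in your region"
    return HTTPStatus.OK, "Welcome"

print(handle_request("203.0.113.5"))   # HTTP 451 for a UK address
print(handle_request("198.51.100.7"))  # HTTP 200 elsewhere
```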

Proposed Technical Alternatives

  • Ideas floated: device‑local age estimation with attestation tokens, bank/government‑issued verifiable credentials (e.g. OpenID‑based), national e‑ID systems, or simple content‑rating headers plus device/ISP filters (see the sketch after this list).
  • Critics respond that coordination, incentives, issuer trust, and eventual misuse (for broader discrimination and tracking) make these “clean” crypto solutions fragile in practice.
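
A minimal sketch of one of these floated ideas, the content‑rating header plus device‑side filter (the `X-Content-Rating` header name and rating vocabulary are assumptions for illustration, not an existing standard): the site self‑declares a rating and a parental‑control profile on the device or at the ISP enforces it, with no identity documents or biometrics exchanged.

```python
# Device-side filter that trusts a site's self-declared rating header.
BLOCKED_FOR_MINORS = {"adult", "mature"}

def allowed_for_minor(response_headers: dict[str, str]) -> bool:
    """Decide locally from the site's self-declared rating; nothing about the
    user is sent anywhere. Unrated content is allowed here, though a stricter
    profile could default to blocking it."""
    rating = response_headers.get("X-Content-Rating", "unrated").lower()
    return rating not in BLOCKED_FOR_MINORS

# Example: a child profile inspecting the headers of two responses.
print(allowed_for_minor({"X-Content-Rating": "adult"}))  # False: blocked locally
print(allowed_for_minor({"Content-Type": "text/html"}))  # True: unrated, allowed
```

The design choice is that enforcement moves entirely to the client, so the server never learns who is asking.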

Impact on Platforms, Users, and Society

  • Some see this as accelerating a shift off Discord (especially for open‑source projects) and entrenching big incumbents and specialized compliance vendors.
  • Others emphasize collateral damage: cutting abused, queer, or isolated teens off from vital online support; excluding people who can’t easily be verified; and further normalizing pervasive identity checks online.