I hacked a dating app (and how not to treat a security researcher)

Security tools and reverse engineering

  • Several comments highlight Charles Proxy as a de facto standard tool for intercepting and reverse‑engineering mobile app traffic (akin to IDA Pro for binaries).
  • Certificate pinning is cited as the main barrier to using these tools on modern apps (the second sketch below shows the client‑side check a proxy has to defeat).
  • Some readers discover Charles for the first time, and several share their own MITM setups for inspecting device traffic or isolating devices (the first sketch below shows a minimal proxy‑routing setup).
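
For readers new to the workflow, here is a minimal sketch of the interception setup commenters describe: point an HTTP client at Charles' local listener and trust its exported root CA so HTTPS can be decrypted. The endpoint and certificate path are placeholders, not details from the actual app.

```python
import requests

# Charles listens on localhost:8888 by default; trusting its exported
# root CA is what lets the proxy decrypt HTTPS on the fly.
PROXIES = {"http": "http://localhost:8888", "https": "http://localhost:8888"}

resp = requests.get(
    "https://api.example-datingapp.com/v1/users/42",  # hypothetical endpoint
    proxies=PROXIES,
    verify="charles-ssl-proxying-certificate.pem",    # Charles' exported root CA
)
print(resp.status_code, resp.text)
```

The same idea applies to a phone: set the device's proxy to the machine running Charles and install the proxy's CA certificate in the OS trust store.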
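
And a rough sketch of why pinning is the barrier: a pinned client doesn't trust whatever CA the OS trusts; it compares the server's certificate against a value baked into the app, so an interception proxy's substitute certificate fails the check. The host and fingerprint below are made up.

```python
import hashlib
import socket
import ssl

# Hypothetical pinned value: SHA-256 of the server's DER-encoded leaf
# certificate, captured at build time.
PINNED_SHA256 = "replace-with-real-fingerprint"

def leaf_cert_fingerprint(host: str, port: int = 443) -> str:
    # Complete a normal TLS handshake, then hash the peer's certificate.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der).hexdigest()

if leaf_cert_fingerprint("api.example-datingapp.com") != PINNED_SHA256:
    # A Charles-style proxy presents its own certificate, so the hash
    # changes and a pinned client refuses to continue.
    raise ssl.SSLError("certificate fingerprint mismatch: possible interception")
```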

Company response and disclosure duties

  • Many read the company’s initial engagement (taking a meeting, fixing the bug) followed by silence as an attempt to “push it under the rug.”
  • There is a strong view that users should be notified, especially given the sensitivity of the leaked data (passports, sexual preferences, chats, phone numbers, location).
  • Others argue the company’s only real duty is to fix the issue; informing the researcher or “the public” is framed by some as optional, though others point out breach‑notification laws likely apply.
  • There’s concern that weak or absent penalties make this “business as usual.”

Legality and risk for security researchers

  • Multiple commenters note that what the researcher did is likely illegal in many jurisdictions once they started enumerating and accessing other users’ data.
  • The Auernheimer/AT&T case and similar prosecutions are referenced as cautionary examples; intent and what data is stored or disclosed matter a lot.
  • Some advocate 90‑day public disclosure deadlines after being ignored; others warn this is a good way to get sued or criminally charged and stress getting legal advice and minimizing data collection.

Technical failings of the app

  • Returning the OTP in the API response is widely ridiculed as “wild” and read as symptomatic of having no security model, or of treating the client as trusted.
  • The likely cause is seen as naive scaffolding: serializing DB models directly to JSON, returning created rows verbatim, or leaving test conveniences in production (the first sketch below shows the anti‑pattern and the fix).
  • Simple ID enumeration and the lack of proper access controls are noted as extremely basic, preventable mistakes, especially egregious for a dating app holding passport images and intimate data (the second sketch below shows the missing object‑level check).
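
A minimal Flask sketch of the OTP anti‑pattern and the obvious fix. The route name, the store, and the SMS helper are hypothetical stand‑ins, not reconstructed from the app in question.

```python
import secrets

from flask import Flask, jsonify, request

app = Flask(__name__)
OTP_STORE = {}  # stand-in for a real server-side store with expiry (e.g. Redis TTL)

def send_sms(phone: str, body: str) -> None:
    """Stub for an SMS gateway; the OTP should only ever travel out of band."""
    print(f"SMS to {phone}: {body}")

@app.post("/v1/auth/request-otp")
def request_otp():
    phone = request.json["phone"]
    otp = f"{secrets.randbelow(10**6):06d}"  # 6-digit code from a CSPRNG
    OTP_STORE[phone] = otp
    send_sms(phone, f"Your code is {otp}")
    # The flaw ridiculed in the thread: echoing the freshly created record,
    # OTP included, straight back to the untrusted client:
    #     return jsonify({"phone": phone, "otp": otp})   # never do this
    return jsonify({"status": "sent"})  # reveal nothing the client must not know
```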
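
And a sketch of the missing object‑level authorization check that makes ID enumeration pay off. The routes and the toy token map are made up for illustration.

```python
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

CHATS = {1: ["hey"], 2: ["hi there"]}   # toy data; a real app queries a database
TOKENS = {"token-for-user-1": 1}        # hypothetical bearer-token -> user-id map

def current_user_id() -> int:
    token = request.headers.get("Authorization", "").removeprefix("Bearer ")
    if token not in TOKENS:
        abort(401)  # authentication: who is calling?
    return TOKENS[token]

@app.get("/v1/users/<int:user_id>/chats")
def get_chats(user_id: int):
    # Without this check, sequential IDs mean anyone can walk
    # /v1/users/1, /v1/users/2, ... and read every account's data.
    if current_user_id() != user_id:
        abort(403)  # authorization: may this caller see this record?
    return jsonify(CHATS.get(user_id, []))
```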

Responsibility, competence, and “student app” debate

  • Some argue the developers are students or junior engineers and shouldn’t be judged as harshly as big, well‑funded companies that do similar or worse.
  • Others vehemently reject this: if you handle high‑risk PII (passports, sexuality, intimate chats), you have no excuse not to understand basic security—or you shouldn’t build or launch the product.
  • Broader discussion emerges about “move fast and break things,” shipping POCs to production, and weak organizational prioritization of security versus features and timelines.

Regulation, penalties, and professionalization

  • Many call for stronger regulation and real financial or legal consequences for mishandling PII (GDPR is cited as a partial deterrent; US law is seen as weaker).
  • Suggestions include large fines, breach‑reporting requirements with teeth, and even treating PII like “nuclear waste,” with near‑existential penalties after a leak.
  • A substantial subthread debates licensing or professionalization of software engineers (analogies to civil engineering, food safety); others worry about over‑regulation, gatekeeping, and unintended harm to open‑source and small developers.

Platforms, users, and systemic issues

  • Apple’s app review is criticized as “security theater”: it doesn’t and realistically can’t vet backend security, but its walled‑garden image may give users false confidence.
  • Some argue users should be more cautious about handing such apps sensitive data; others push back that blaming users is unfair and that systemic protections and enforcement are needed.
  • Anecdotes from other insecure apps (e‑commerce, dating, even government systems) reinforce that similarly egregious flaws are common and often quietly patched without user notification.