Meta suppressed research on child safety, employees say

Reaction to Meta suppressing child-safety research

  • Many see this as part of a long pattern: Meta learns its products harm people (especially kids), then buries or downplays findings rather than fixing problems.
  • Commenters distinguish between:
    • Merely failing to “release” research, and
    • Actively deleting recordings and written records, which is viewed as far more damning.
  • A minority argues that big tech gets attacked whether it releases, leaks, or withholds research, making openness less attractive. Others respond that the outrage is about the content of the findings and Meta’s repeated failure to act on them.

Proposed protections for kids (especially in VR)

  • Suggested interventions:
    • Monitoring or recording interactions for post-hoc reporting.
    • Stronger age verification.
    • Banning underage users, or making access contingent on parental oversight.
    • More aggressive moderation and referrals to law enforcement for abusers.
  • Disagreements:
    • Some argue massive human moderation is “not scalable”; others say Meta could simply hire far more staff, analogizing to large logistics workforces.
    • Privacy vs. safety tension: monitoring and ID checks erode privacy, but several commenters feel the current scale of harm to children justifies stronger measures.

Who is responsible: corporations, parents, or government?

  • One camp: core problem is profit-maximizing corporations under weak regulation; self‑regulation is called a “joke.” They call for heavy fines, executive liability, and systemic changes to shareholder‑primacy norms.
  • Another camp emphasizes parental responsibility and opposes expanding state control, arguing parents should restrict devices and teach kids. Critics respond this is unrealistic given modern work pressures, split households, and the scale/targeting of platforms.
  • Some argue blaming “the fox” (Meta) is less productive than “building a fence” via law and collective action; others insist companies still have direct moral obligations.

Social media as the “new tobacco”

  • Widespread analogy: social media platforms knowingly profit from user misery and damage to youth mental health, much as tobacco companies once knowingly profited from their products’ harms.
  • Others push back that, unlike tobacco, social tools have real utility (family connection, community groups), which complicates simple bans.

Boycotts, network effects, and alternatives

  • Many urge deleting Meta accounts; others report serious social and practical costs from doing so (missed events, exclusion from family chats, losing touch with local businesses, even job opportunities) because of network effects.
  • Proposed mitigations include:
    • Moving to federated or smaller networks without engagement feeds.
    • Replacing online time with local volunteering and real‑world communities.
  • Overall mood: deep distrust of large tech firms, mixed with pessimism about how hard they are to escape.