TikTok 'directs child accounts to pornographic content within a few clicks'
Experiences with TikTok Content
- Many commenters say they have never seen explicit nudity or “literal porn” on TikTok despite long-term use; they mostly see “thirst traps” and suggestive but clothed content.
- Others report encountering outright porn very quickly on TikTok, Bluesky, X, or Facebook Shorts, even without likes or follows, suggesting scroll time alone is a strong signal.
- Some note that kids/teens click on things adults would ignore and react more strongly to sexual content, so their feeds may evolve differently.
How TikTok’s Algorithm Targets Users
- Commenters outline that TikTok uses many signals: age, device, location/IP, contacts, search history, link opens, and especially watch/scroll time (see the sketch after this list).
- One view: if you claim to be 16 on an Android phone, you’ll see what similar nearby 16-year-old Android users watch.
- This makes it hard to define a “natural” algorithmic baseline; recommendations reflect complex feedback loops.
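A minimal sketch of how the signals commenters describe might combine into a per-video ranking score. Every field name, weight, and the scoring form here is an illustrative assumption, not TikTok's actual system:

```python
# Hypothetical sketch of a watch-time-weighted ranking score. All names and
# weights are illustrative assumptions based on the signals commenters list.
from dataclasses import dataclass

@dataclass
class Signals:
    watch_seconds: float    # dwell time on similar videos (the strongest signal, per comments)
    rewatches: int          # loops / repeat views
    searched_related: bool  # related terms appear in search history
    cohort_affinity: float  # 0..1: how much "similar nearby users" (age, device, IP region) watch this

def rank_score(s: Signals) -> float:
    score = 1.0 * s.watch_seconds          # passive dwell time dominates
    score += 5.0 * s.rewatches             # repeat views amplify the signal
    score += 10.0 if s.searched_related else 0.0
    score += 20.0 * s.cohort_affinity      # cold start: lean on the demographic cohort
    return score

# A new "16-year-old on Android" with no likes or follows still gets a
# nonzero score purely from cohort affinity and dwell time.
print(rank_score(Signals(watch_seconds=12.0, rewatches=2,
                         searched_related=False, cohort_affinity=0.8)))
```

The point commenters stress is that passive dwell time outweighs explicit actions, so a feed can drift toward sexual content with zero likes or follows, and a child's feed diverges simply because children linger on different things.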
Global Witness Study & Article Credibility
- Method: fake 13-year-old accounts, restricted mode on, clean phones, then following TikTok’s suggested search terms and “you may like” prompts.
- Critics say the researchers actively hunted for edge cases using obfuscated "in the know" search terms, so the study manufactures outrage from rare paths rather than reflecting the normal user experience.
- Others counter that some sexualized suggestions appeared immediately, that content then escalated to explicit porn, and that for children, any path to porn in restricted mode is unacceptable.
- Several doubt the claim because they personally cannot find porn, and the published screenshots show mostly bikinis and mildly NSFW scenes; the explicit examples are withheld.
Is Sexualized Content Harmful for Teens?
- One side: even “just thirst traps” contribute to hypersexualization, warped body image for both sexes, and unhealthy parasocial dynamics (e.g., OnlyFans funnels, “simps”).
- Other side: sexualized-but-clothed content is akin to past Playboy/lingerie exposure, not inherently harmful; burden of proof lies with those demanding restrictions.
- There is debate over conflating sexy imagery with pornography and whether 13–17-year-olds seeing such content is actually problematic.
Moderation, Law, and Practical Limits
- Some argue child accounts should have zero access to porn under any search term; others say this is technically impossible at TikTok scale without destroying the business.
- Back-of-the-envelope calculations suggest human pre-moderation of all uploads would cost billions annually and still be imperfect (see the worked estimate after this list).
- Comparisons are made between Disney (fully controlled content) and user-generated platforms; critics of TikTok treat the two as equivalent, which others call unrealistic.
- The UK Online Safety Act’s requirement to “prevent” harmful content is seen by some as far beyond “reasonable measures.”
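One way to reproduce that back-of-the-envelope estimate. Every input is an assumed, rounded figure (uploads per day, review time, and labor cost are guesses, not reported numbers); only the order of magnitude matters:

```python
# Rough cost of human pre-moderation of every upload. All inputs assumed.
uploads_per_day = 30_000_000   # assumed daily video uploads
review_seconds = 30            # assumed average human review time per video
hourly_cost = 20.0             # assumed fully loaded cost per reviewer-hour (USD)
utilization = 0.75             # fraction of a paid hour spent actually reviewing

reviewer_hours_per_day = uploads_per_day * review_seconds / 3600 / utilization
annual_cost = reviewer_hours_per_day * hourly_cost * 365

print(f"{reviewer_hours_per_day:,.0f} reviewer-hours per day")  # ~333,333
print(f"${annual_cost / 1e9:.1f}B per year")                    # ~$2.4B
```

Even these conservative inputs land in the billions per year before appeals, multilingual review, or reviewer turnover, which is the core of the "technically impossible at TikTok scale" argument.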
Broader Platform & Political Context
- Multiple commenters note that Instagram, Snapchat, X, and Facebook expose users (including kids) to similar or worse sexual and harmful content (e.g., vapes, drugs, cruelty).
- Some see the TikTok focus as part of a geopolitical and lobbying campaign: the “national security” narrative failed, so now it’s “think of the children.”
- Others defend scrutiny from human-rights groups, linking platforms to propaganda, misinformation, and psychological harm.
Parenting, Phones, and Society
- Several describe being shocked by Snapchat's front-page content and by peer pressure that makes opting kids out socially costly.
- Suggested responses include: dumb phones, saying “no” even if it causes ostracism, and stricter regulation of child-facing feeds.
- A number of commenters see algorithmic social media as a major societal harm comparable to cigarettes or leaded gasoline.