What even is 'adult' content? [NSFW]

NSFW, “adult,” and pornography

  • Many argue the article deserved an NSFW tag even if the image (a nude pregnant woman) isn’t pornographic; “NSFW” is about workplace norms, not intrinsic sexual content.
  • Others note museums, history books, sculpture, and medical illustrations routinely depict nudity and would be absurd to treat as “porn,” yet in many offices any visible nudity is risky.
  • Some think the article cherry‑picks edge cases to imply porn is undefinable, instead of acknowledging a large, obvious universe of explicit sexual material.

Is nudity inherently sexual?

  • Strong split: some say any nudity is clearly sexual; others distinguish “being nude” from “sexually suggestive,” pointing to saunas, nudist culture, medical contexts, and family norms.
  • Several emphasize that arousal is subjective: if a viewer is turned on, that doesn’t automatically make an image “sexual” for policy purposes.
  • Fetishes (pregnancy, armpits, balloons, etc.) are cited to show that if “anything someone eroticizes” counts as sexual, almost everything would.

Culture, religion, and norms

  • One camp blames Western/Abrahamic religious taboos for shame around nudity; critics argue all large societies regulate public nudity, so it’s broader than religion.
  • Examples given: European topless beaches, mixed saunas, Freikörperkultur vs. more prudish US norms (e.g., breastfeeding controversies).

Protecting children vs censorship and overreach

  • Many agree kids shouldn’t have unguided access to extreme sexual content, but doubt technical measures (filters, NSFW flags, ISP blocks) meaningfully stop motivated teens.
  • Some stress ongoing parental conversation over technical controls; others want ISP‑level porn blocking lists and possibly “healthy porn” vs “aggressive porn” distinctions.
  • Skeptics warn effective blocking will mainly push youth toward shadier sites and give governments pretexts to extend control (e.g., over VPNs, broader speech).

Age verification and digital identity

  • Major concern: sending passport scans or other sensitive IDs to “sleazy websites,” especially under the UK’s Online Safety Act.
  • Some advocate modern cryptographic digital IDs and selective‑disclosure credentials as a more privacy‑respecting solution, but note such systems are not yet widely deployed.
  • Regulators’ suggested methods (photo‑ID matching, facial age estimation, open banking checks, email‑based estimation, etc.) are seen as intrusive, immature, and likely outsourced to third‑party verifiers.
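The selective-disclosure idea mentioned above can be illustrated with a toy sketch: a trusted issuer inspects the full identity record but signs only the predicate “over 18,” so a site can verify age without ever seeing a birthdate or ID scan. Everything here is hypothetical and deliberately simplified — real credential systems use public-key signatures or zero-knowledge proofs rather than a shared HMAC key, which is used below only to keep the example self-contained.

```python
import hashlib
import hmac
import json

# Hypothetical issuer secret. In a real deployment the verifier would check a
# public-key signature instead of sharing this key with the issuer.
ISSUER_KEY = b"issuer-secret"

def issue_credential(birth_year: int, current_year: int) -> dict:
    """Issuer sees the full record but signs only the derived predicate."""
    claim = {"over_18": current_year - birth_year >= 18}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verify_credential(cred: dict) -> bool:
    """Site learns only the over-18 predicate, never the birthdate."""
    payload = json.dumps(cred["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["sig"]) and cred["claim"]["over_18"]

cred = issue_credential(birth_year=1990, current_year=2024)
print(verify_credential(cred))  # True: age proven, birthdate never disclosed
```

The point of the sketch is the data-minimization property the commenters want: the relying site receives one signed bit, and tampering with the claim invalidates the signature.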

Censorship, politics, and “adult” as a control label

  • Commenters highlight how “adult” labeling can be used to suppress LGBTQ+ content, historical works like Maus, and other non‑sexual but politically sensitive material.
  • Recent platform actions (e.g., itch.io’s adult‑content removals) are cited as examples where LGBT‑tagged but non‑explicit content disappeared.

Porn, sex work, and misogyny

  • Discussion around Instagram’s “breastfeeding porn” problem and OnlyFans touches on how policies that try to carve out nuanced exceptions get exploited.
  • Some see hostility to OnlyFans as heavily gendered and moralizing, treating women in sex work as degraded while ignoring male‑run porn industries.
  • Others insist sex work is inherently degrading and not comparable to other “unglamorous” but socially accepted jobs; counter‑arguments stress bodily risk in male‑dominated jobs and autonomy of sex workers.

Violence vs sexual content

  • Multiple comments note the inconsistency that graphic or normalized violence (crime shows, Star Wars, slapstick cartoons) is widely accessible, while consensual sex and nudity are more tightly controlled.
  • Some argue “graphic violence” for kids and teens is at least as, or more, harmful than non‑violent sexual content; others respond that both need careful age‑appropriate handling.

Definitional fuzziness and policy defaults

  • One long critique says fuzzy boundaries (sorites‑style) don’t mean categories are useless; we routinely regulate with imperfect lines (food safety, medicines, clothing norms).
  • From that view, when we “can’t perfectly decide,” default‑block can be safer than default‑allow, analogous to modern computer security hardening.
  • Others push back that leaning on edge cases to deny all regulation is as misguided as using fuzziness to justify broad censorship.
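The default-block analogy above can be sketched as a simple policy gate, mirroring deny-by-default firewall rules: access is allowed only when content is affirmatively classified as safe, and anything unclassified or ambiguous stays blocked. The category labels here are hypothetical, standing in for whatever imperfect classifier a real filter would use.

```python
# Hypothetical set of labels a classifier can affirmatively mark as safe.
KNOWN_SAFE = {"education", "news", "medical_reference"}

def access_decision(labels: set[str]) -> str:
    """Default-block: allow only content whose every label is known-safe.

    An empty label set means the page is unclassified, so it stays
    blocked rather than slipping through a fuzzy boundary.
    """
    if labels and labels <= KNOWN_SAFE:
        return "allow"
    return "block"

print(access_decision({"education"}))           # allow
print(access_decision({"education", "adult"}))  # block: one unsafe label
print(access_decision(set()))                   # block: unclassified
```

The design choice being debated is exactly this asymmetry: a default-allow version would flip the fallback to "allow", trading fewer false blocks for more unvetted content getting through.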