X offices raided in France as UK opens fresh investigation into Grok
Allegations Against X/Grok and Legal Scope
- Many commenters view Grok as a “CSAM machine,” citing widespread reports that it undresses real or realistic-looking minors and generates sexual deepfakes, often from real photos, with X publicly distributing the results.
- Others push back that there is no clear evidence Grok generated CSAM in the legal sense, and note the French prosecutor’s initial statement didn’t explicitly use “CSAM,” but instead referenced:
  - Pornographic images of minors
  - Sexually explicit deepfakes and image‑rights violations
  - Holocaust denial content
  - Manipulation of automated data processing
  - Fraudulent data extraction by an organized group
- Several expect investigators to seek internal emails, moderation policies, metrics, risk warnings, or decisions prioritizing engagement over safety, rather than a literal “Grok CSAM Plan” folder.
What Counts as CSAM? Real vs AI‑Generated
- Large subthread on definitions:
  - One side: CSAM = record of actual child sexual abuse; AI deepfakes (even of real minors) are abusive but legally distinct and not CSAM in most current law.
  - Others argue many jurisdictions (e.g., Sweden, Japan, at least in some cases) treat sexualized images of minors, including drawings or AI edits, as illegal and sometimes equivalent to CSAM.
- Debate over whether undressing a child via AI is “just” image abuse or child abuse in itself, with some citing real-world harms such as bullying and suicides after deepfakes circulated.
- There’s disagreement and confusion about national legal standards, translations, and whether newer CSAM definitions “dilute” the term.
Free Speech, Censorship, and Cultural Diversity
- Some argue this is not a speech case but straightforward enforcement against illegal content (CSAM, Holocaust denial in France, fraud, data violations).
- Others frame it as part of broader state control over platforms and speech; worry about raids as “political pressure” or attacks on a political dissident.
- A few welcome heterogeneous national standards as a safeguard against global monoculture; others counter that censorship reduces diversity and mainly entrenches those in power.
Use of Social Platforms by Public Institutions
- Strong criticism of prosecutors moving from X to LinkedIn/Instagram: those platforms are still US‑owned, closed, algorithmic, and not public‑service‑oriented.
- Several argue governments should prioritize open, auditable channels (websites, RSS) and treat commercial platforms as secondary distribution.
Raids, Data, and Enforcement Reality
- Discussion of what a raid on a satellite office yields:
  - Seizure of workstations, local mail caches, documents, and credentials; potential leverage over employees as witnesses.
  - Counterpoints that everything is encrypted or cloud‑hosted, with speculation about “kill switches” versus the legal risk of destroying evidence.
- France is described as unusually raid‑happy for white‑collar and tech investigations compared to other Western countries.
Broader Political and Corporate Context
- Some see Musk/X as destabilizing Europe and pushing far‑right narratives; others warn banning platforms outright would be authoritarian.
- Concern that folding xAI into SpaceX could entangle a key US defense contractor in EU legal jeopardy, complicating future contracts and a SpaceX IPO.