TikTok users can't upload anti-ICE videos. The company blames tech issues
Perceived Censorship vs. “Technical Glitch”
- Users report being unable to upload or share anti‑ICE content and messages containing “Epstein,” while TikTok attributes it to data‑center “technical issues” and power outages.
- Many commenters treat “tech issues” as a fig leaf, comparing it to classic state‑media excuses when sensitive material is suppressed.
- Others note that some anti‑ICE videos and political content do still appear, so the pattern and scope of suppression remain unclear.
Motives Behind the Forced Sale
- One camp sees the TikTok sale as fundamentally about letting the US government and aligned oligarchs control a highly influential youth platform and suppress dissenting narratives (anti‑ICE, pro‑Palestinian, anti‑Israel, Epstein‑related).
- Another argues the original impetus was fear of CCP influence and data access, with later politicians explicitly citing pro‑Palestinian TikTok content as a reason to push the ban/sale.
- Several stress the bipartisan nature of such moves: both parties expand surveillance and speech control tools, assuming “their” side will wield them.
US vs. China: Competing Authoritarian Models
- Many draw a parallel between Douyin/TikTok censorship of Tiananmen, Tibet, Uyghurs, etc. and emerging US‑style filtering of domestic abuses, arguing the US is converging on “its own version of Chinese control.”
- Others counter that China’s control is far more total; Americans are still openly discussing these issues on multiple platforms and in mainstream outlets.
Algorithms, Propaganda, and Access
- Repeated theme: the difference between formal “access” to information and what users actually see in recommendation feeds. Even leaky filters still shape reality for heavy TikTok users.
- Some argue all large platforms (Meta, Google, X, even HN via flags) already skew and bury topics; TikTok is just another propaganda machine, with the operator swapping from CCP to US billionaires.
- Discussion of “algospeak” (e.g., “unalived”) illustrates how automated moderation already forces linguistic workarounds.
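The "algospeak" point can be made concrete with a toy sketch. This is purely illustrative (not TikTok's actual moderation system, whose internals are not public): a naive token blocklist of the kind that coined terms like "unalived" are designed to evade. The blocklist contents and function name here are hypothetical.

```python
# Toy illustration of naive keyword moderation (NOT any real platform's
# system). "Algospeak" substitutions evade exact-match blocklists.
BLOCKLIST = {"killed", "suicide"}

def is_flagged(text: str) -> bool:
    """Flag text if any blocklisted word appears as a token."""
    tokens = text.lower().split()
    return any(token.strip(".,!?") in BLOCKLIST for token in tokens)

print(is_flagged("He was killed in the raid"))    # → True (blocked)
print(is_flagged("He was unalived in the raid"))  # → False (sails through)
```

Because the filter matches literal tokens, any spelling not in the list passes, which is exactly the arms race the thread describes: moderation systems then chase the substitutions, and users invent new ones.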
Media Ownership and Narrative Control
- Strong concern about consolidation: Ellison family (CBS, US TikTok), Musk (X), Meta, and others portrayed as an increasingly right‑wing media bloc aligned with the current administration.
- CBS/60 Minutes fights over the CECOT episode are cited as evidence of stories being delayed or softened, not fully killed—an example of subtle narrative management rather than outright blackout.
Technical Transition vs. Intentional Suppression
- A former TikTok US Data Services employee explains that the US instance relied on ByteDance’s global models and that, post‑divestment, the US side may be effectively rebuilding its own recommendation stack.
- Many users independently report a sudden, dramatic “preference wipe” and bizarre feeds around the sale date. Some see this as normal algorithm “exploration”; others say it feels qualitatively different and suspiciously timed.
- Several conclude it’s likely a messy combination of real migration problems and new political/keyword filters.
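Why would rebuilding a recommendation stack look like a "preference wipe"? A minimal epsilon-greedy sketch illustrates the mechanism commenters are debating. This is a toy model under stated assumptions, not TikTok's actual recommender: with no learned preference scores (the cold-start case), every pick is effectively random exploration, which to a user looks like a bizarre, wiped feed.

```python
import random

# Toy epsilon-greedy recommender (illustrative only). A model with no
# learned scores behaves as pure exploration: random recommendations.
CATALOG = ["cooking", "politics", "cats", "sports"]

def recommend(scores: dict[str, float], epsilon: float) -> str:
    if not scores or random.random() < epsilon:
        # Cold start or exploration: pick any topic at random.
        return random.choice(CATALOG)
    # Exploitation: pick the highest-scoring known topic.
    return max(scores, key=scores.get)

random.seed(0)
trained = {"politics": 0.9, "cats": 0.4}
print(recommend(trained, epsilon=0.1))  # usually "politics"
print(recommend({}, epsilon=0.1))       # always random: the "wiped" feed
```

This is consistent with both readings in the thread: a migrated-but-empty model produces the same surface symptom (random-feeling feeds) as a deliberate reset, so users cannot distinguish the two from the outside.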
Civil Liberties, Political Tools, and Precedent
- The thread repeatedly zooms out to the long‑running expansion of executive power (surveillance, emergency authorities, now direct interference in specific businesses and platforms).
- Commenters warn that tools built to target “bad guys” or foreign adversaries inevitably get turned inward—drawing analogies to ICE killings, militarized policing, and future potential use of the same apparatus by the opposite party.
Alternatives and Structural Fixes
- Some advocate abandoning corporate social media entirely or moving to federated/P2P systems with open ranking algorithms, though others point out federation’s long‑standing usability and moderation challenges.
- There’s broad pessimism that any large, ad‑driven, centralized platform will remain a neutral forum; the underlying incentive is always to “manufacture consent” for whoever holds the levers.
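What advocates mean by an "open ranking algorithm" can be sketched in a few lines. This is a hypothetical example, not any real federated protocol's scoring rule; the field names (`created_at`, `follows_author`) and weights are assumptions for illustration. The point is that the rule is public and auditable, so users can see exactly why a post ranks where it does.

```python
from datetime import datetime, timezone

# Hypothetical "open ranking" rule: public, auditable scoring, unlike
# opaque engagement-optimized feeds. Fields and weights are illustrative.
def score(post: dict, now: datetime) -> float:
    age_hours = (now - post["created_at"]).total_seconds() / 3600
    recency = 1.0 / (1.0 + age_hours)        # newer posts rank higher
    affinity = post["follows_author"] * 0.5  # boost followed accounts
    return recency + affinity

now = datetime.now(timezone.utc)
posts = [
    {"id": "a", "created_at": now, "follows_author": 0},
    {"id": "b", "created_at": now, "follows_author": 1},
]
ranked = sorted(posts, key=lambda p: score(p, now), reverse=True)
print([p["id"] for p in ranked])  # followed author's fresh post first
```

A transparent rule like this trades away engagement optimization for inspectability, which is precisely the trade the pessimists in the thread doubt any ad-driven centralized platform will make.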