Stanford report highlights growing disconnect between AI insiders and everyone else
Disconnect Between Tech & Everyone Else
- Many see startup culture shifting from “make something people want” to “make something investors want,” with AI pushed top‑down regardless of demand.
- Commenters argue current AI leadership prioritizes hype, valuations, and control over social needs, fueling distrust.
Generational Attitudes & Backlash
- Several report strong Gen Z hostility: AI is seen as cheating, low‑quality “slop,” and a threat to their already‑precarious future (housing, jobs, climate).
- Some note older adults are more receptive, happily consuming AI content, while kids and teens mock AI outputs.
- There’s broader bitterness about intergenerational “ladder‑pulling” and AI as the next way to squeeze the middle class.
Education & Campus Climate
- Anecdotes of non‑CS “AI‑adjacent” courses under‑enrolling, possibly due to student backlash and a perception that the courses are useless.
- In contrast, core CS AI courses remain oversubscribed and selective.
Workplace Experience: Hype vs Reality
- Many engineers report being underwhelmed: tools help with boilerplate but often hallucinate, lie subtly, or degrade code quality.
- AI is heavily promoted by executives and ML teams; some organizations now track token usage and AI adoption as performance metrics.
- Others share contrary experiences: with good prompting, LLMs can “one‑shot” tasks that would have taken weeks, especially for experienced users.
Jobs, Layoffs & Inequality
- Strong concern that AI‑driven productivity will resemble post‑1980 trends: gains go to shareholders, not workers.
- Layoffs are widely attributed (at least rhetorically) to AI; many suspect it’s often a scapegoat for cost‑cutting.
- Junior engineers/interns appear disproportionately squeezed; companies prefer fewer seniors plus AI, risking future talent pipelines.
- Commenters debate whether AI is truly replacing jobs or whether executives are acting on hype and “vibes.”
Capabilities, Limits & Use Cases
- Mixed views: models are seen as good at translation, grammar, log triage, document retrieval, and some medical pattern‑finding, but weak on reliability, deep reasoning, and specialized domains.
- “Vibe coding” and AI‑generated content are seen as generating tech debt and low‑quality output that will eventually “blow up.”
Governance, Safety & Rollout
- Many argue the real “alignment problem” is aligning companies with society, not models with companies.
- There’s low trust in government regulation (especially in the US), and frustration at a rushed, poorly explained rollout that anthropomorphizes models while offering little public education.
- Some favor open, local models and utility‑style regulation of data centers; others worry open models still enable spam, scams, and creative theft.