AI's Dial-Up Era
Infrastructure & Bubble Comparisons
- Several commenters contest the “railroad” and “dot‑com fiber” analogies: rail and fiber remained physically valuable for decades, whereas GPUs depreciate within a few years and data centers age quickly.
- Others counter that a lot of current spend is on durable assets: buildings, power and cooling systems, undersea cables, and possibly new power plants. Even if the GPU layer is scrapped, power and connectivity could remain useful.
- A competing analogy is “canal mania”: huge investment in a transitional technology that was soon bypassed by something better suited, as canals were by railways and as general‑purpose GPUs might be by specialized AI hardware.
Economics, Depreciation & Bubble Risk
- Strong concern that this bubble is worse than past ones because the main asset (GPUs) wears out or becomes obsolete before many players can reach profitability.
- Discussion of quantitative “bubble gauges” (capex‑to‑revenue ratios, share of GDP, valuation multiples, funding quality) and macro indicators like the Buffett indicator (total market cap relative to GDP); some think AI spending is still demand‑led, others see classic overinvestment and circular funding. A toy calculation of these gauges follows this list.
- Some argue we’re replaying a gold‑rush dynamic: the tech can be real and valuable while the financial layer is wildly overextended.
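To make the gauges above concrete, here is a minimal sketch of the arithmetic behind them. All input figures are hypothetical placeholders chosen only to show how the ratios are formed; they are not drawn from the thread or from real filings.

```python
# Toy bubble-gauge arithmetic. Every input below is a hypothetical placeholder,
# included only to illustrate how each ratio is computed.

ai_capex = 300e9          # annual AI infrastructure capex, USD (hypothetical)
ai_revenue = 40e9         # annual AI-attributable revenue, USD (hypothetical)
gdp = 28e12               # nominal GDP, USD (hypothetical)
total_market_cap = 55e12  # total equity market cap, USD (hypothetical)

capex_to_revenue = ai_capex / ai_revenue    # >1x means the buildout outruns current revenue
capex_share_of_gdp = ai_capex / gdp         # how large the buildout is relative to the economy
buffett_indicator = total_market_cap / gdp  # classic market-cap-to-GDP gauge

print(f"capex / revenue:    {capex_to_revenue:.1f}x")
print(f"capex share of GDP: {capex_share_of_gdp:.2%}")
print(f"Buffett indicator:  {buffett_indicator:.0%}")
```

The disagreement in the thread is less about the arithmetic than about the inputs: whether the revenue figure is genuinely demand‑led or inflated by circular funding between vendors and their customers.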
Capabilities, Usefulness & Limits of Current AI
- Experiences diverge sharply: some find LLMs transformative for debugging, refactoring, documentation, research, and even complex planning (e.g., solar installations); others report frequent errors and hallucinations that nullify any time savings.
- Skeptics argue current LLM architectures are near their useful ceiling and are limited by inherently probabilistic behavior; they doubt this path leads to AGI without a fundamentally new approach.
- Supporters respond that diminishing returns don’t mean saturation, and that breakthroughs (new architectures, training schemes, linear attention, better feedback loops) could reset the curve.
Centralization vs Personal/Local AI
- Many see today as a “mainframe era”: a few hyperscalers rent access to giant models; most users act as thin clients despite having powerful local hardware.
- Others point to growing local‑model ecosystems (Ollama, LM Studio, on‑device models from Apple/Google) but note technical and usability barriers for mainstream users; a minimal sketch of a local‑model call appears after this list.
- Debate over whether true “personal computing for AI” will ever dominate, or whether economics and subscriptions will keep most capability centralized.
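For context on what “personal computing for AI” means in practice, here is a minimal sketch of querying a model running entirely on local hardware. It assumes an Ollama server is running on its default port (11434) and that a model such as llama3 has already been pulled; the model name is an assumption, so substitute whatever is installed locally. The endpoint and fields follow Ollama’s documented /api/generate interface.

```python
# Minimal local-model query via Ollama's HTTP API (http://localhost:11434).
# Assumes the Ollama server is running and a model has been pulled locally
# (e.g. `ollama pull llama3`); the model name below is an assumption.
import json
import urllib.request

payload = {
    "model": "llama3",   # assumed local model name; replace with one you have installed
    "prompt": "Summarize the dial-up era of the internet in two sentences.",
    "stream": False,     # request a single JSON response instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])  # generated text; nothing leaves the local machine
```

Nothing in this loop touches a hyperscaler, which is the crux of the thin‑client vs. personal‑computing debate; the counterpoint is that setting this up is exactly the usability barrier mainstream users won’t cross.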
Labour, Society & Ethics
- Concerns that AI, like physical automation (the mechanical “claw” on garbage trucks that replaced collection crews), will turn labor into capital, rewarding owners immediately while leaving little safety net for displaced workers.
- Some argue AI will eliminate specific tasks, not whole professions, and will roll through jobs unevenly; others predict permanent job loss in areas like CRUD development and grading.
- Several comments lament massive AI capex versus alternative uses, especially climate mitigation, and criticize training on scraped web data as unsustainable and exploitative.
On the Dial‑Up Analogy & Historical Parallels
- Some think the “dial‑up” framing presumes the conclusion: it implies today’s janky, expensive AI is an early stage of an inevitable revolution, which is not yet proven.
- Some recall that 1990s internet optimism was already sky‑high; others remember widespread skepticism; there’s debate over how obvious the internet’s eventual impact really was.
- Alternative frames: this could be AI’s “VR/fusion/FTL” era (big promises that stall), or simply another hype cycle on the Gartner curve whose long‑term slope is still unclear.