The AI Investment Boom
AI coding tools: useful autocomplete vs. unreliable partner
- Experiences diverge sharply. Some find Copilot/Claude “barely better than autocomplete,” or actively harmful on complex tasks, injecting subtle bugs and wasting debugging time.
- Others report 10x speedups on small greenfield projects or boilerplate (CRUD, serializers, tests, config), plus good explanations of unfamiliar APIs or code.
- Consensus: the tools work best on:
  - Simple, common patterns with abundant training data.
  - Local, testable changes or small projects.
- They struggle with:
  - Large, indirection-heavy codebases: deep call chains, generics, inheritance, dependency injection (DI).
  - Obscure or poorly documented APIs/SDKs, or “hidden knowledge” that requires experimentation.
Hallucinations, trust, and liability
- Many see hallucinations as a showstopper for production use, especially in customer support, enterprise Q&A, or other external-facing tools.
- Others argue:
  - Humans also “confabulate”; LLMs can still be valuable where outputs are verifiable or the stakes are low.
  - Retrieval-augmented generation (RAG) and better models reduce hallucinations but do not eliminate them.
- Businesses are wary because AI vendors disclaim liability, while human employees remain legally accountable.
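The retrieval-augmented generation mentioned above can be sketched in miniature: fetch the documents most relevant to a query, then constrain the model’s prompt to that retrieved context. The keyword-overlap retriever, toy corpus, and prompt template below are illustrative stand-ins; real systems use vector-similarity search and an actual LLM call.

```python
# Minimal RAG sketch: ground the prompt in retrieved documents so the
# model is steered toward answering from sources instead of guessing.

def retrieve(query, corpus, k=2):
    """Rank documents by naive keyword overlap with the query (a
    stand-in for real vector search) and return the top k."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, corpus):
    """Assemble a prompt that tells the model to answer only from the
    retrieved context -- this reduces, but does not eliminate, hallucination."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return (
        "Answer using ONLY the context below; say 'not found' otherwise.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Hypothetical support-FAQ corpus for illustration.
corpus = [
    "Refunds are processed within 5 business days.",
    "Support is available Monday through Friday.",
    "Enterprise plans include a dedicated account manager.",
]
prompt = build_prompt("How long do refunds take?", corpus)
print(prompt)
```

Even with grounding like this, the model can still ignore the instruction or misread the context, which is why the thread treats RAG as a mitigation rather than a fix.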
Hardware-heavy boom vs. software value
- Several note the article fixates on GPUs and data centers, likening this to selling “shovels in a gold rush.”
- Debate over whether this is an “AI boom” or a “GPU/datacenter boom”:
  - Some expect future value mostly in software and applications, as with past hardware waves.
  - Others worry AI commoditizes software, shrinking long-term software moats and shifting value to incumbents with non-tech moats.
Bubble dynamics and infrastructure analogies
- Many compare this to railroads, dot-com fiber, or proof-of-work crypto mining:
  - Overbuild → crash → long-run benefit from the excess infrastructure (fiber, power, factories, data centers).
- Skeptics counter that GPUs age quickly; unlike railroads, compute hardware may be e-waste within a decade. The lasting assets would be power plants, buildings, and grid upgrades, not the chips.
- Timing of a bust is disputed: “we’re early, like 1995” vs. “near 1999; crash within 1–2 years.”
Energy demand and nuclear/renewables
- Huge projected datacenter loads drive:
  - Interest in nuclear power (including SMRs and TRISO-fueled designs) and large renewable/grid buildouts.
  - Fears of higher consumer electricity prices and misallocated capital.
- Some see this as an ironic but positive catalyst for non-carbon energy; others stress we should reduce energy use rather than invent new high-demand use cases.
Broader impacts and “AI everywhere”
- Concerns about:
  - A jobless AI boom: heavy capex paired with weak tech-job growth.
  - “AI-powered” features added for hype (e.g., trivial story generators, appliances with pointless AI).
  - A flood of low-quality “AI slop” content, degraded search results, and worsening bot-driven customer support.
- Supporters emphasize real gains: internal tools (e.g., enterprise search across Slack/docs/code), faster learning, documentation/summarization, and enabling small “useful but would-never-be-built” tools.