OpenAI and Nvidia announce partnership to deploy 10GW of Nvidia systems
Positioning of Major Players (Apple, Microsoft, Oracle)
- Some wonder where Apple is in this capex arms race; replies note that Apple spends similar sums on buybacks and focuses on efficient on-device AI instead of massive datacenters.
- The Nvidia–OpenAI deal is read as OpenAI diversifying away from exclusive dependence on Microsoft/Azure; others note Microsoft already hedged with Anthropic and that OpenAI is also tied up with Oracle.
- Several comments see the whole ecosystem as increasingly incestuous: cloud vendors, model labs, and Nvidia all cross‑investing and reselling to each other.
What 10GW Actually Means
- Estimates of the GPU count range from ~2–5 million accelerators, depending on per‑GPU/system power draw (1–5 kW+) and cooling overhead.
- Comparisons offered:
  - Roughly the average power use of the Netherlands, or of NYC and Chicago combined.
  - About 40% of Bitcoin’s electrical draw, though the GPUs would contribute only a tiny fraction of its hash power, since mining runs on specialized ASICs.
  - Equivalent to multiple large nuclear plants, or over 100 nuclear submarines’ reactors.
- Many emphasize that at this scale, power (not chip count) is the binding constraint.
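The back-of-envelope math behind these estimates can be sketched as follows; the per-accelerator power figures are illustrative assumptions spanning the 1–5 kW+ range mentioned above, not vendor specifications.

```python
# Rough estimate: how many accelerators fit in a 10GW power budget?
# All per-unit figures are illustrative assumptions, not vendor specs.

TOTAL_POWER_W = 10e9  # 10 gigawatts

# Assumed all-in power per accelerator (chip + host + networking + cooling).
scenarios_kw = {"2 kW all-in": 2.0, "3 kW all-in": 3.0, "5 kW all-in": 5.0}

for label, kw in scenarios_kw.items():
    count_millions = TOTAL_POWER_W / (kw * 1_000) / 1e6
    print(f"{label}: ~{count_millions:.1f} million accelerators")
```

At 2–5 kW all-in per unit, this reproduces the ~2–5 million accelerator range cited in the thread.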
Why Use Power (GW) as the Metric?
- Datacenters are planned, permitted, and financed around megawatts/gigawatts: compute per watt improves over time, but grid capacity and cooling are fixed, long‑lead constraints.
- Some view the GW framing as honest and alarming given fragile grids and rising residential prices; others see it as marketing theater to impress investors.
- There is confusion over whether the 10GW figure refers to nameplate capacity or typical utilization.
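The distinction matters because it changes the energy math. A minimal sketch, assuming 10GW nameplate and a few illustrative load factors (the deal itself specifies none), converts the power figure into annual energy:

```python
# Convert a nameplate power figure into annual energy at different load factors.
# The utilization values below are illustrative assumptions, not disclosed figures.

HOURS_PER_YEAR = 8760
NAMEPLATE_GW = 10

for utilization in (1.0, 0.8, 0.6):
    twh_per_year = NAMEPLATE_GW * HOURS_PER_YEAR * utilization / 1_000
    print(f"{utilization:.0%} load factor: {twh_per_year:.1f} TWh/year")
```

At full utilization this comes to ~87.6 TWh/year, which is why nameplate vs typical draw substantially shifts the country-scale comparisons above.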
Nvidia’s “Up to $100B” Investment in OpenAI
- Interpreted as in‑kind or circular: Nvidia “invests” and OpenAI uses the money to buy Nvidia hardware.
- Many call this a form of vendor financing or “round tripping” that inflates revenue and valuations: Nvidia sells chips funded by its own capital and gets OpenAI equity back.
- Others argue it’s legitimate strategic investing: real hardware is built and used, and Nvidia simply trades margin today for equity in a key customer.
Bubble, Accounting, and Systemic Risk Concerns
- A large minority sees this as classic late‑stage bubble behavior, likening it to the 1990s telco boom and dot‑com‑era vendor financing.
  - The story: debt- and stock-fueled capex, circular deals, and valuations dependent on unrealistically high future AI profits.
- Counterarguments:
  - Inference revenues are already large; some claim each major model has recouped its training cost.
  - Even without AGI, AI is already deeply useful (coding, search, enterprise features), so massive investment may be rational.
- Disagreement over legality: some call it “Enron‑like”, others note it’s disclosed, equity-based, and thus unlikely to be prosecuted as fraud.
Grid, Bills, Water, and Climate
- Strong anxiety that AI datacenters will drive higher consumer electricity bills and grid upgrades paid by ratepayers, while hyperscalers secure favorable industrial pricing.
- Technical commenters note:
  - 10GW requires huge new generation and transmission; power lines, transformers, and cooling are long‑lead bottlenecks.
  - Co‑locating datacenters near generation (hydro, solar, nuclear, gas) and using waste heat (e.g. district heating) are possible mitigations, but not trivial ones.
- Water use for cooling sparks debate: some consider it overblown, others highlight local drought impacts and water pollution around DCs.
Value vs Waste: What Are We Buying?
- Skeptics: 10GW likely yields marginal gains—better chatbots, influencers, ads—rather than cures for cancer; they see misallocation of capital that could go to medicine, clean energy, or education.
- Supporters:
  - Argue compute is the new foundational infrastructure (like fiber in 2000) and that AI will eventually underpin huge productivity gains.
  - Emphasize that, post‑bubble, society may inherit abundant compute and power infrastructure, as happened with dark fiber after the telco crash.
Future Overhang and Secondary Effects
- If AI demand disappoints, commenters expect:
  - A glut of aging GPUs and overbuilt datacenters; a possible crash in AI infra prices; pressure on power generators that expanded for AI.
  - But also cheaper compute for research and other industries, similar to cheap broadband post‑dot‑com.
- Some foresee intensifying vertical integration: AI firms or their backers directly investing in new generation (including nuclear and large solar) and power trading to secure their own supply.