AI won't use as much electricity as we are told (2024)

Cement analogy and sustainability framing

  • Several commenters dispute using cement as an example of “ignored” emissions: concrete is widely recognized as a major problem and an active target for reduction.
  • Others argue modern infrastructure depends on concrete in a way AI does not, so electricity for AI should be scrutinized more, not less.
  • The article’s suggestion that the 20th‑century industrial economy was “sustainable” is challenged as sidestepping degrowth critiques.

Growth curves, rebounds, and IT’s energy share

  • Many accept that naive “infinite hockey-stick” projections are usually wrong, but note that demand typically grows until constrained.
  • The rebound effect / Jevons paradox is repeatedly raised: efficiency gains (5G vs 4G, faster chips, better models) can expand usage so much that total energy still rises (see the arithmetic sketch after this list).
  • Counterpoint: some digital uses substitute for more energy‑intensive activities (travel, paper, physical logistics), potentially reducing net energy even if IT use rises.
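To make the rebound argument concrete, here is a minimal arithmetic sketch in Python. All numbers are invented for illustration and are not figures from the article or the thread; the point is only that a large efficiency gain can be outpaced by induced demand.

```python
# Illustrative rebound-effect (Jevons paradox) arithmetic.
# All workload sizes and per-unit energy figures below are hypothetical.

def total_energy_wh(units_of_work: float, energy_per_unit_wh: float) -> float:
    """Total energy in Wh for a given workload and per-unit energy cost."""
    return units_of_work * energy_per_unit_wh

# Baseline: 1 billion operations at 1 Wh each.
baseline = total_energy_wh(units_of_work=1e9, energy_per_unit_wh=1.0)

# Efficiency improves 5x (0.2 Wh per operation), but cheaper operations
# induce 10x more usage -- the rebound scenario raised in the thread.
rebound = total_energy_wh(units_of_work=1e10, energy_per_unit_wh=0.2)

print(f"baseline:                  {baseline / 1e9:.1f} GWh")  # 1.0 GWh
print(f"after efficiency + rebound: {rebound / 1e9:.1f} GWh")  # 2.0 GWh
```

Whether real AI demand behaves this way is exactly what the commenters dispute; the sketch only shows that "more efficient" and "more total energy" are compatible.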

Is AI different from previous IT waves?

  • Skeptics argue AI looks more like Bitcoin mining: ever‑increasing difficulty, escalating model sizes, and GPU requirements that push toward extreme energy use.
  • Others emphasize efficiency trends: cheaper training (e.g. newer models, specialized hardware), performance‑per‑watt gains, and the expectation that optimization has barely begun.
  • There’s debate over whether Moore’s law–style improvements will continue strongly enough to offset rising demand.

Current evidence: power deals, grids, and water

  • Multiple comments point to concrete signals: gigawatt‑scale nuclear and power contracts, grid constraints (e.g. Ireland, US regions), and firms saying they are “power‑blocked.”
  • Critics say this undermines the article’s analogy to earlier overblown IT predictions; this time billions are already being spent on new generation.
  • Concern is raised about local impacts: higher regional prices, siting in cheap/dirty grids, fossil‑fueled turbines near communities, and significant water use for cooling (per‑query water estimates cited).

How big is a single AI query?

  • One side cites figures like ~0.24–0.3 Wh per LLM prompt, arguing per‑use energy is comparable to an old Google search and small compared with everyday activities.
  • Others question vendor‑provided numbers as unverified and stress that aggregate demand (and fossil generation) is what matters; they call for transparent methodologies and Wh‑per‑prompt accounting.
  • Some note that even if IT’s relative share stays near 1–2% of global electricity, absolute consumption can still be large because total demand is rising (see the back‑of‑envelope sketch after this list).
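A back‑of‑envelope sketch of why both sides can be right: per‑prompt energy is small, but aggregate demand scales with volume. The ~0.3 Wh figure is the one cited in the thread; the global query volume is a made‑up assumption, and training, cooling, networking, and idle capacity are excluded.

```python
# Per-query vs aggregate energy. WH_PER_PROMPT comes from the ~0.24-0.3 Wh
# range cited in the discussion; PROMPTS_PER_DAY is a hypothetical volume.

WH_PER_PROMPT = 0.3      # upper end of the cited per-prompt range
PROMPTS_PER_DAY = 1e9    # assumed global daily volume, for illustration only

daily_wh = WH_PER_PROMPT * PROMPTS_PER_DAY
annual_gwh = daily_wh * 365 / 1e9   # Wh -> GWh

print(f"per prompt: {WH_PER_PROMPT} Wh (tiny on its own)")
print(f"daily:      {daily_wh / 1e6:.0f} MWh")    # ~300 MWh/day
print(f"annual:     {annual_gwh:.0f} GWh")        # ~110 GWh/year, inference only
```

The calculation says nothing about which generation sources supply that load, which is why several commenters insist the aggregate and the grid mix, not the per-query number, are what matter.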

Crypto as comparison

  • The article’s claim that ending crypto mining could offset AI’s rise is challenged: proof‑of‑stake chains use drastically less energy than proof‑of‑work, possibly comparable to card networks.
  • Others maintain that crypto remains a clear example of large, mostly wasteful energy use, unlike AI, which at least has broad potential applications.

Software inefficiency and dematerialization

  • Several point out massive software inefficiency (layers of abstractions, Excel+SQL+COBOL, Python HFT bots), arguing we have not reduced waste in computing, only the hardware wattage needed per unit of work.
  • Others argue technology since the 1950s has dematerialized the economy (fewer trips, fewer physical goods), and AI could accelerate this, potentially being net‑saving in energy.

Uncertainty, pace, and social questions

  • Some note the article is a year old and the space is moving fast; new evidence (grid stress, rising prices, nuclear buildouts) may weaken its confidence.
  • Multiple commenters stress that we lack solid long‑term data on AI workloads and diurnal demand curves; armchair reasoning is unreliable.
  • Beyond electricity, people worry about economic and social stability if AI really does deliver massive white‑collar automation: who earns, who buys, and what happens when many lose income.