Sam Altman Is Losing His Grip on Humanity

Resource priorities: humans vs. AI

  • Several comments argue the core issue isn’t brain energy minutiae but where society chooses to invest energy and resources: raising capable humans vs. scaling AI that could make many people economically superfluous.
  • One view: decisions are driven far more by power and control than by efficiency or human well‑being; machines are easier to control than people.
  • Others push back that this “everything is about control” framing is overly conspiratorial and prevents distinguishing between “bad” and “truly awful” uses of power.
  • A separate moral critique: treating human procreation in cost–benefit terms (“worthwhile,” “expensive”) mirrors the logic of slavery; people are ends, not assets.

Power, capital, and systemic critique

  • Some tie Altman-style grandiose claims to a broader over‑capitalized economy: too much money chasing too few productive outlets encourages bubbles, fraud, and fantastical narratives (crypto, NFTs, hoarding compute).
  • There’s disagreement on trickle‑down: one side says concentrated capital inevitably does “stupid” or harmful things; another insists markets still self‑adjust and investment can be productive.
  • Debate over elites: one side argues rich and powerful groups systematically act to increase their power, with higher sociopathy rates; another counters that ordinary people are just as capable of greed and malice, and outcomes are more chaotic than conspiratorial.

Assessments of Altman and OpenAI

  • Many comments are openly hostile, portraying him as a liar, grifter, authoritarian personality, or tech sociopath comparable to other high‑profile CEOs.
  • Some think his recent statements and odd partnerships look like a CEO “throwing everything at the wall” as costs and hype diverge.
  • Others emphasize structural incentives: choosing a monetization‑focused leader over a research‑focused one signals investors’ priorities, not just personal flaws.
  • On OpenAI’s business, there’s a split:
    • One camp sees a bubble: massive R&D burn, fragile moat, and likely acquisition or marginalization once big platforms roll their own models.
    • Another argues inference and subscriptions are already profitable; sunk GPU and datacenter investments will become a durable moat when model quality converges.

AI capability vs. human value

  • Some claim current AI is already more competent and useful than most people for computer‑based work and will soon dominate “verifiable” domains.
  • Others object that this ignores the broader scale and meaning of human life, and that it lets leaders prefer AI that is 70–80% correct over fallible but autonomous humans, even when doing so dehumanizes workers and decision‑making.

Energy, environment, and data centers

  • A detailed proposal suggests strict rules for AI datacenters: off‑grid, non‑fossil energy generation that eventually feeds surplus power back to the grid, and no fresh water for cooling (wastewater only), with heavy penalties for violations.
  • Supporters see this as both climate‑aligned and innovation‑forcing; skeptics argue it would entrench only the largest cloud providers, distort siting of power and water infrastructure, or simply be bypassed via fossil generators and national‑security rhetoric.
  • Some question why data centers should be singled out when many other industries waste far more water or energy; others reply that DCs are at least technically amenable to closed‑loop designs.

“Train a human” and evaluation of the article

  • Multiple commenters argue that using “train a human” in context was an ordinary or even joking phrase, and building an entire critique around it is overreach.
  • Others say the joke is revealing: it fits a pattern of viewing humans through the same optimization lens as models, and so is fair game for scrutiny.
  • There’s also a broader complaint that the article contributes little beyond “X is bad” sentiment, offers no serious argument against materialist views of mind, and resembles a recurring “two minutes hate” ritual more than substantive engagement.