Extropic is building thermodynamic computing hardware

What the hardware is supposed to be

  • Commenters converge on the view that this is not a general-purpose CPU/GPU replacement but specialized analog/stochastic hardware.
  • Core idea: massively parallel probabilistic bits (“p-bits”) implementing Gibbs sampling, i.e. fast, low-energy sampling from complex distributions rather than simple uniform RNG.
  • One view: they’re essentially an analog simulator for Gibbs sampling / energy-based models, potentially useful for denoising steps in diffusion or older Bayesian/graphical model workloads.
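The sampling primitive the bullets above describe can be sketched in software. The following is a minimal, illustrative Gibbs sampler for a small Ising-style energy model (the function name and the specific model are my own choices, not Extropic’s API); the pitch is that dedicated p-bit hardware would perform the per-spin conditional flips natively and in parallel rather than in a Python loop:

```python
import numpy as np

def gibbs_sample_ising(J, h, steps=1000, rng=None):
    """Naive Gibbs sampler for an Ising-style energy model
    E(s) = -(1/2) s·J·s - h·s, with spins s_i in {-1, +1}.
    Sweeps each spin in turn, resampling it from its conditional
    distribution given the rest (this is what p-bits would do in analog)."""
    rng = rng or np.random.default_rng(0)
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    for _ in range(steps):
        for i in range(n):
            # Local field felt by spin i from its neighbors plus bias.
            field = J[i] @ s - J[i, i] * s[i] + h[i]
            # Conditional probability of s_i = +1 given all other spins.
            p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
            s[i] = 1 if rng.random() < p_up else -1
    return s
```

With a strong positive coupling between two spins, samples are overwhelmingly aligned (both +1 or both −1), which is the kind of structured randomness that a uniform RNG alone does not give you.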

Relationship to prior work and terminology

  • Commenters note prior companies in “thermodynamic computing” / stochastic hardware, and say Extropic has already shifted from superconducting concepts to standard CMOS.
  • Several argue this is just stochastic or analog computing under new branding; “thermodynamic computing” is criticized as buzzwordy and potentially misleading.
  • Others say the underlying ideas are decades old (stochastic/analog computers, probabilistic programming), with the novelty largely in CMOS integration and scale of RNG/p-bits.

Claims, benchmarks, and real-world value

  • There is real hardware: an FPGA prototype, an ASIC prototype (XTR-0), a paper, and open-source code; some stress that this makes outright “vaporware” accusations unfair.
  • Skeptics counter that existence of hardware and a paper does not imply commercial relevance; benchmark examples (e.g., Fashion-MNIST) are seen as unimpressive and small-scale.
  • Questions raised:
    • Are the quoted 10×–100× speed/energy gains over CPU/GPU meaningful at the full-system level (per Amdahl’s law)?
    • Why benchmark against FPGAs instead of shipping an FPGA product, or building a digital ASIC first?
    • Is random sampling actually a bottleneck in modern AI workloads? Many say no for today’s deep learning.
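The Amdahl’s-law concern can be made concrete with a back-of-the-envelope calculation (the 10% fraction below is illustrative, not a figure from Extropic):

```python
def amdahl_speedup(accelerated_fraction, local_speedup):
    """Overall speedup when only a fraction of the workload is accelerated.
    Amdahl's law: S = 1 / ((1 - f) + f / s)."""
    return 1.0 / ((1.0 - accelerated_fraction)
                  + accelerated_fraction / local_speedup)

# If sampling is only 10% of end-to-end runtime, even a 100x faster
# sampling chip yields a modest overall gain:
print(round(amdahl_speedup(0.10, 100), 2))  # ≈ 1.11
```

The quoted 10×–100× numbers only translate into system-level wins if sampling dominates the workload, which ties directly into the question of whether sampling is a bottleneck at all in modern deep learning.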

Fit with current AI paradigms

  • Multiple comments argue the stack appears optimized for 2000s-era Bayesian / graphical / energy-based methods, not for today’s large transformer models where matrix multiplies dominate.
  • Some speculate this could enable a “renaissance” of sampling-based methods; others think it’s too late and will stay niche unless model paradigms shift.

Hype, aesthetics, and skepticism

  • The website’s heavy visual flair, cryptic runes, and slow, CPU-hungry frontend strongly contribute to “hype/scam” vibes.
  • Opinions split: some see genuine, risky deep-tech experimentation; others see overblown marketing, vague claims, and unclear answers to basic practical questions (precision, verification, ecosystem, reproducibility).