DRAM pricing is killing the hobbyist SBC market

Impact on SBC Pricing and Use

  • Commenters report steep price hikes on higher-RAM SBCs (e.g., 8–16GB Raspberry Pi 5 and Compute Modules), in some cases nearing or exceeding the price of cheap used laptops and mini PCs.
  • Many say that at current prices SBCs lose their “cheap tiny computer” appeal and are no longer obvious starter hardware for hobbyists.
  • Some argue “killing the market” is overstated: low‑RAM models (Zero 2 W, Pi 3 series, 2–4GB boards) remain relatively affordable, and past spikes (e.g., COVID) eventually eased.
  • Others stress that for low‑income and non‑US users, these price shifts are a real barrier to entry.

Shift to Microcontrollers and Used x86 Hardware

  • Rising SBC costs push people toward:
    • Microcontrollers (RP2040/Pi Pico, ESP32, Arduino, etc.), especially now that MicroPython, JavaScript runtimes, and drag‑and‑drop UF2 flashing make them easier to program.
    • Used corporate SFF/mini PCs and laptops, which often offer 8–32GB RAM and SSDs for similar or less money, at the cost of higher power draw and no native GPIO.
  • Debate: some see SBCs as overkill for LED‑blinking‑type tasks; others value full Linux, TCP/IP, and familiar tooling as a learning bridge into embedded.
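The ease-of-use point above is concrete: with MicroPython flashed onto a board (via UF2 drag-and-drop), a working program is a few lines. A minimal sketch for a Raspberry Pi Pico's onboard LED, using MicroPython's `machine` module (runs on the device, not on a PC; the pin name assumes recent Pico firmware):

```python
# MicroPython sketch: blink the onboard LED on a Raspberry Pi Pico.
# Deployed by saving as main.py on the board after UF2 flashing.
from machine import Pin  # MicroPython's GPIO module (device-only)
import time

led = Pin("LED", Pin.OUT)  # onboard LED on recent Pico firmware

while True:
    led.toggle()      # flip the LED state
    time.sleep(0.5)   # blink at ~1 Hz
```

No toolchain, cross-compiler, or OS image is involved, which is much of the appeal versus a full Linux SBC for LED-blinking-type tasks.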

DRAM, Storage, and AI Demand

  • Multiple anecdotes of DRAM, SSD, HDD, and SD card prices jumping 3–10x, with server/workstation quotes 2–3x or more versus a few years ago.
  • Many attribute this primarily to AI datacenter demand and a shift of DRAM makers toward high‑margin HBM; others also mention fuel, helium supply, and geopolitics.
  • There is controversy over alleged “collusion” and reports of very large DRAM allocations to major AI firms; some see ordinary market response to big customers, others note past DRAM price‑fixing and risks of over‑reliance on one buyer.
  • Some expect new fabs and Chinese entrants to ease prices in a few years; others fear a longer‑term “new normal.”

Broader Consequences and Software Efficiency

  • Analysts cited in the thread predict that elevated memory costs will hit low‑ and mid‑range smartphones hardest, potentially halving volumes; hobbyist SBCs and budget phones are portrayed as “collateral damage” of the AI boom.
  • Many commenters lament software bloat and hope high RAM prices will force better memory discipline; others say orgs don’t care enough to prioritize this.
  • LLMs are seen both as part of the problem (driving datacenter demand) and as a possible tool for writing more memory‑efficient code, especially in languages with strong safety and type systems; there is significant skepticism that “just ask AI to optimize it” will work reliably.
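The “memory discipline” commenters hope for often comes down to simple choices such as lazy iteration instead of materializing whole collections. A minimal CPython sketch (illustrative only, not from the thread) comparing the two:

```python
import sys

# Eager: materialize a million squares in RAM at once.
squares_list = [n * n for n in range(1_000_000)]

# Lazy: a generator holds only its execution state, not the values,
# so memory use stays constant regardless of the range size.
squares_gen = (n * n for n in range(1_000_000))

print(f"list:      {sys.getsizeof(squares_list):>10,} bytes")
print(f"generator: {sys.getsizeof(squares_gen):>10,} bytes")
```

On CPython 3.x the list object alone is several megabytes (before counting the int objects it references), while the generator is a few hundred bytes.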