Helldivers 2 devs slash install size from 154GB to 23GB

What Changed and Why the Game Was So Big

  • Developers originally duplicated common assets across many per-level archives to minimize HDD seek times, a long-standing technique dating back to CD-era disc layout.
  • On console builds they assumed SSDs and didn’t duplicate, so console install sizes were always much smaller; PC builds kept the HDD-optimized layout.
  • The decision was based on “industry data” suggesting up to ~5× worse HDD load times without duplication; they conservatively doubled that estimate instead of profiling their own game.
  • Later measurements showed Helldivers 2’s loading is dominated by CPU-side level generation running in parallel with asset loading, so deduplication only adds a few seconds on HDD in worst cases.
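The size blowup from per-level duplication can be sketched with a toy packing model. Everything here (level names, asset names, sizes) is invented for illustration, not Helldivers 2's real asset data:

```python
# Hypothetical sketch of why per-level asset duplication inflates install size.
# All level/asset names and sizes are made up for illustration.
from itertools import chain

# Map each level to the assets it needs; shared assets appear in many levels.
levels = {
    "desert": ["terrain_desert", "helldiver_rig", "bug_swarm", "ui_common"],
    "jungle": ["terrain_jungle", "helldiver_rig", "bug_swarm", "ui_common"],
    "ice":    ["terrain_ice",    "helldiver_rig", "bot_army",  "ui_common"],
}
asset_size_gb = {
    "terrain_desert": 2.0, "terrain_jungle": 2.0, "terrain_ice": 2.0,
    "helldiver_rig": 4.0, "bug_swarm": 3.0, "bot_army": 3.0, "ui_common": 1.0,
}

# HDD-friendly layout: each level archive carries full copies of its assets,
# so loading a level is one mostly sequential read.
duplicated = sum(asset_size_gb[a] for a in chain.from_iterable(levels.values()))

# Deduplicated layout: each unique asset is stored once and shared.
unique = {a for assets in levels.values() for a in assets}
deduplicated = sum(asset_size_gb[a] for a in unique)

print(f"duplicated:   {duplicated:.1f} GB")   # 30.0 GB
print(f"deduplicated: {deduplicated:.1f} GB") # 17.0 GB
```

With only three levels the duplicated layout is already nearly 2× larger; across hundreds of mission archives the ratio compounds toward the 154 GB → 23 GB gap.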

Reactions to the Size Reduction (154 GB → 23 GB)

  • Many see 23 GB as very small for a modern, visually rich, content-heavy title, given that 60–100+ GB installs are common.
  • Others argue even 23 GB is substantial and highlights how far asset bloat (especially high-res textures and audio) has gone compared to older games.
  • Players on limited SSDs, Steam Deck–like devices, or consoles welcome the change, because just a handful of 100+ GB titles forces constant “install juggling.”

HDD vs SSD, and Performance Tradeoffs

  • Some are surprised that 11% of players still use mechanical drives; others note the common setup of a small SSD for the OS plus a large HDD for games and media.
  • Some explain how large contiguous archives reduce random I/O on HDDs and why this historically justified data duplication; others counter that modern SSDs and OS caching make the tradeoff far less compelling.
  • Several commenters highlight that disk I/O is often not the real bottleneck; many games are CPU- or GPU-bound during loading.
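The “I/O isn’t the bottleneck” point can be made concrete with a toy model: if asset streaming overlaps with CPU-side level generation, wall-clock load time is the slower of the two. All throughput, seek, and timing figures below are assumptions chosen for illustration, not measured numbers:

```python
# Toy model of overlapped loading: asset I/O runs in parallel with CPU-side
# level generation, so the wall-clock load is the slower of the two tracks.
# All figures are made-up assumptions for illustration.

def load_time(cpu_gen_s: float, data_gb: float, seq_mb_s: float,
              seek_penalty_s: float = 0.0) -> float:
    """Wall-clock load when I/O and CPU generation run concurrently."""
    io_s = data_gb * 1024 / seq_mb_s + seek_penalty_s
    return max(cpu_gen_s, io_s)

CPU_GEN = 25.0     # assumed seconds of procedural level generation
LEVEL_DATA = 3.0   # assumed GB read per mission load

# HDD, duplicated layout: near-sequential reads, few seeks.
dup_hdd = load_time(CPU_GEN, LEVEL_DATA, seq_mb_s=150, seek_penalty_s=1.0)
# HDD, deduplicated layout: extra seeks hopping between shared archives.
dedup_hdd = load_time(CPU_GEN, LEVEL_DATA, seq_mb_s=120, seek_penalty_s=6.0)
# SSD: seeks are essentially free either way.
ssd = load_time(CPU_GEN, LEVEL_DATA, seq_mb_s=2000)

print(f"HDD duplicated:   {dup_hdd:.1f}s")
print(f"HDD deduplicated: {dedup_hdd:.1f}s")
print(f"SSD:              {ssd:.1f}s")
```

Under these assumed numbers the duplicated and SSD cases are fully CPU-bound (25s), and deduplication on an HDD costs only the few seconds by which I/O overshoots the CPU work, which matches the “only adds a few seconds in worst cases” finding.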

Costs, Incentives, and Engineering Culture

  • Multiple comments note that studios and storefronts don’t directly pay for users’ SSD capacity, so there’s weak economic pressure to optimize install size.
  • Estimates suggest the wasted space collectively represented many millions of dollars in user hardware “cost,” contrasted with a likely much smaller engineering cost to fix.
  • Some see this as a textbook case of premature or vibe-based optimization; others defend the team as a small studio juggling engine limitations, live-service content, and more urgent bugs.
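The aggregate-cost claim is easy to reproduce as back-of-envelope arithmetic. The saved size comes from the thread; the player count and $/GB price are assumptions, and changing them scales the result linearly:

```python
# Back-of-envelope estimate of the collective storage "cost" to players.
# SAVED_GB comes from the 154 GB -> 23 GB reduction; the player count and
# SSD price are assumptions, not figures from the discussion.

SAVED_GB = 154 - 23          # 131 GB freed per install
PLAYERS = 10_000_000         # assumed number of installs
SSD_COST_PER_GB = 0.06       # assumed consumer SSD price, USD/GB

aggregate_cost = SAVED_GB * PLAYERS * SSD_COST_PER_GB
print(f"~${aggregate_cost / 1e6:.0f}M of user SSD capacity freed")
```

Even at a fraction of these assumed numbers, the freed capacity is worth tens of millions of dollars of user hardware, dwarfing the engineering time the fix likely took.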