Stack Overflow is almost dead
Role of AI vs Other Decline Factors
- Many say LLMs “ate SO’s lunch”: they now ask ChatGPT/Perplexity instead of visiting or posting.
- Others argue the decline started ~2014, long before modern LLMs, due to better docs, smarter tools, GitHub issues, official vendor forums, YouTube, and tutorials.
- Some claim SO has simply “answered most common questions” already, so new-question volume naturally fell. Critics counter that much of that content is outdated and it is hard to tell which answers are still valid.
Licensing, Copyright, and AI Training
- Discussion notes that SO content is licensed under Creative Commons (CC BY-SA), but there’s debate over whether AI companies honor the attribution and share-alike obligations.
- Several commenters share anecdotes of LLMs reproducing SO posts or comments verbatim, suggesting more than abstract “learning.”
- Others argue such snippets are legally de minimis and that copyright (and thus CC) covers expression, not the underlying facts.
Moderation, Culture, and “Toxicity”
- A major thread is hostility toward SO’s aggressive closing, downvoting, and editing culture, especially from ~2014 onward.
- Many describe good, novel questions being closed as duplicates or “off-topic,” discouraging participation and pushing people to Reddit/Discord.
- Defenders argue SO was never meant as a helpdesk or chat forum but as a tightly curated, searchable knowledge base; strict closure and anti-chitchat policies are seen as essential, not “power trips.”
- There’s deep disagreement over whether this gatekeeping preserved quality or killed the community.
Duplicates, Completeness, and Site Purpose
- Curators emphasize that duplicates are closed with links to the canonical question rather than forbidden outright, and that consolidating answers into a single canonical Q&A improves search and avoids repeated low-value answers.
- Critics say closing questions as “duplicates” of vaguely related questions with different contexts became common, making SO feel hostile and useless for real, current problems.
Future of LLMs and Knowledge Sources
- Several worry that if SO and similar sites atrophy, LLMs will lack fresh, vetted training data for new languages/frameworks, leading to self-cannibalizing, lower-quality answers.
- Others think future models can learn more directly from code, docs, and repositories, or from new Q&A platforms.
- Some foresee SO (or successors) becoming primarily a structured data source for LLM training, which others view as a dystopian “humans-labeling-for-AI” future.
Business, Infrastructure, and Alternatives
- Commenters note SO’s question volume is down to ~2009 levels but still far from “zero”; traffic might remain high enough for it to function as a static reference.
- Private equity ownership, attempts to bolt on AI products against community consensus, and deals selling content to LLM vendors are seen as signs of strategic drift.
- Many now rely on GitHub issues, project Discords/Slacks, and official forums, though these are fragmented and often not indexed by search engines.