Wikipedia says traffic is falling due to AI search summaries and social video
Is declining traffic harmful?
- One side argues a 501(c)(3) without ads shouldn’t need perpetual growth; lower traffic might just lower hosting costs.
- Others counter that traffic is core to Wikipedia’s model: it drives donations (esp. banner campaigns) and is how readers become editors. Fewer visits mean fewer contributors and less money.
- Several note that AI answers and rich search results now sit between readers and Wikipedia, so users benefit from its content without ever visiting the site or seeing donation appeals.
Revenue, costs, and “war chest”
- Commenters dig into annual reports: hosting is 2–4% of expenses; salaries and benefits dominate ($100M of ~$178M).
- Critics see this as bloat for a site whose content is volunteer-written, likening WMF to other “mission-creep” nonprofits and calling the fundraising banners misleading given large reserves and an endowment.
- Defenders say a global, high-availability platform plus engineering, legal, fundraising, community support, and data-center staff reasonably explains the headcount. They argue you can’t equate “salaries” with waste without examining specific programs.
- There’s debate over spending on travel, grants, and “movement support” vs. simply running Wikipedia and investing for long-term survival.
AI scraping, usage shifts, and search intermediation
- Some claim AI scrapers are “hugging Wikipedia to death”; others point to the tiny hosting budget share and say bot traffic is not crushing the servers.
- Technically minded commenters note dumps exist but are hard to parse (wikitext, templates, Lua), so generic HTML scrapers are easier, causing unnecessary load.
- Many report personal usage shifting: LLMs now answer most queries that once led to Wikipedia, with Wikipedia still used for deeper reference (tables, lists, math, filmographies).
- Search AI overviews dramatically cut click-through to all sites, including Wikipedia, which undermines the “open web” and pushes value capture to large platforms.
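The "dumps are hard to parse" point above can be made concrete. Wikipedia's dumps use the MediaWiki XML export format, which is easy to walk with a standard XML parser, but the payload is raw wikitext whose templates and Lua-driven modules only expand at render time. A minimal sketch, using Python's standard library and a tiny inline excerpt standing in for a real dump file (element names follow the export format; the sample content is invented):

```python
import xml.etree.ElementTree as ET

# Tiny inline excerpt mimicking the MediaWiki XML export format used by
# Wikipedia dumps (XML namespace omitted here for brevity).
SAMPLE_DUMP = """<mediawiki>
  <page>
    <title>Example</title>
    <revision>
      <text>'''Example''' is a [[thing]]. {{Infobox|field=value}}</text>
    </revision>
  </page>
</mediawiki>"""

def iter_pages(xml_text):
    """Yield (title, raw_wikitext) pairs from export-format XML."""
    root = ET.fromstring(xml_text)
    for page in root.iter("page"):
        title = page.findtext("title")
        text = page.findtext("./revision/text") or ""
        yield title, text

for title, text in iter_pages(SAMPLE_DUMP):
    # The extracted markup still contains templates ({{...}}) and links
    # ([[...]]) that need a full MediaWiki-aware expander to resolve --
    # which is why naive consumers fall back to scraping rendered HTML.
    print(title, "->", text)
```

Getting from this raw wikitext to clean prose is the hard part the commenters describe; the XML layer itself is trivial.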
Bias, governance, and contributor experience
- Multiple stories describe hostile or politicized editing cultures, “power editors” with conflicts of interest, and opaque or exhausting policy fights, especially on contentious topics.
- Others say most non-controversial, non-political edits go through smoothly and that strict sourcing rules are necessary to keep quality high.
- There’s recurring concern that experts and good-faith newcomers are driven away by bureaucracy, leaving more ideological or entrenched editors.
AI vs. Wikipedia’s role in the knowledge ecosystem
- Some predict LLMs will eventually outcompete Wikipedia as a summarizer of secondary sources; others insist LLMs remain unreliable, opaque, and parasitic on human-created reference works.
- Many argue Wikipedia (and similar projects) are essential “ground truth” for both humans and AI, and that AI companies should significantly fund or be taxed to support the commons they train on.
- A few envision AI agents helping maintain Wikipedia (e.g., cross-language consistency checks), with humans reviewing AI-suggested edits.
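The cross-language consistency idea sketched in the last bullet could look something like the following. This is purely hypothetical (no such tool is named in the thread): structured facts extracted from two language editions are compared, and mismatches are queued for human review rather than auto-edited.

```python
# Hypothetical sketch of a "cross-language consistency check":
# compare extracted infobox-style fields from two language editions
# and flag disagreements for a human editor to resolve.

def find_mismatches(fields_a, fields_b):
    """Return fields present in both editions whose values differ."""
    return {
        key: (fields_a[key], fields_b[key])
        for key in fields_a.keys() & fields_b.keys()
        if fields_a[key] != fields_b[key]
    }

# Invented example data for two editions of the same article.
en = {"birth_date": "1912-06-23", "occupation": "mathematician"}
de = {"birth_date": "1912-06-22", "occupation": "mathematician"}

# Mismatches are only flagged, never auto-corrected -- a human
# reviewer decides which edition is wrong, matching the
# "humans reviewing AI-suggested edits" model from the thread.
print(find_mismatches(en, de))  # → {'birth_date': ('1912-06-23', '1912-06-22')}
```

The design choice mirrors the thread's framing: the agent narrows the search space, while edit authority stays with human editors.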
Social video and generational change
- The article’s claim that “social video” hurts traffic is met with mixed reactions.
- Some say TikTok and YouTube are now primary search/knowledge tools for younger users; others insist they’re mainly entertainment, though examples are given of people using TikTok as a “go-to information source.”
- This trend is seen as diverting both attention and potential future editors away from text-centric projects like Wikipedia.