Google de-indexed Bear Blog and I don't know why
Google’s Power and Centralization
- Several comments relate the de-indexing story to broader concerns that Google effectively decides which businesses and voices survive online.
- Google Maps is cited as having displaced TripAdvisor and local review sites; some share personal experiences of Google wiping out competitors by absorbing their data.
- Others argue centralization is “efficient” due to network effects and user laziness, while critics say this is really just monopoly power disguised as efficiency.
Declining Search Quality and Opaque Indexing
- Many report random de-indexing or deep demotion of sites (blogs, shops, even very large sites) with no clear explanation from Search Console.
- Complaints include misclassified duplicate content, missing pages in specific regions, and inconsistent indexing between Google and Bing.
- Search results are described as increasingly polluted with spam, fake products, and auto-translated content (notably machine-translated Reddit threads), with some saying Google has neglected search in favor of ads and AI.
Speculated Technical Causes of De-indexing
- Hypotheses include: invalid RSS triggering hidden spam heuristics; canonical URL confusion; duplicate content via reverse proxies; sitemap structure/size issues; Unicode-heavy URLs; and odd 301/304 caching interactions.
- Some note Google’s recent change in how it counts impressions/clicks, suggesting methodological shifts may also impact visibility.
- Several point out that false positives are inevitable in large anti-spam systems, but the lack of diagnostics or support makes recovery guesswork.
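Among the hypotheses above, sitemap structure/size issues are one of the few with a concrete, published constraint: the sitemaps.org protocol caps each sitemap file at 50,000 URLs (and 50 MB uncompressed). A minimal sketch of the commonly suggested fix, splitting an oversized URL list into protocol-sized chunks; the function name is hypothetical, not anything from Bear Blog's setup:

```python
# Hypothetical sketch: split a large URL list into sitemap-sized
# chunks, per the sitemaps.org limit of 50,000 URLs per file.
SITEMAP_URL_LIMIT = 50_000

def split_sitemap(urls: list[str], limit: int = SITEMAP_URL_LIMIT) -> list[list[str]]:
    """Return the URL list partitioned into chunks of at most `limit` entries."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

# Each chunk would then be written to its own sitemap file and listed
# in a sitemap index file, which is how the protocol handles large sites.
```

Whether an oversized sitemap actually triggers de-indexing is speculation in the thread; staying within the documented limits simply removes one variable from the guesswork.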
Spam, Negative SEO, and Abuse Patterns
- One detailed case describes attackers hitting a site's internal search page with spammy queries; the queries get echoed back into the page's H1 and title, Google crawls those URLs, and the site is reclassified as scammy until the search pages are marked noindex.
- Commenters mention similar tricks (fake support numbers, reputation management “hacks”) and describe this as a form of negative SEO.
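The mitigation from the case above can be sketched as two defenses applied when rendering a search-results page: escape the user-supplied query before echoing it, and emit a `noindex` robots meta tag so crawled spam queries cannot reshape how the site is classified. This is a hypothetical render function under those assumptions, not the actual code from the thread:

```python
from html import escape

def render_search_page(query: str) -> str:
    # Hypothetical sketch of a hardened search-results page.
    # Defense 1: escape the query before echoing it into <title>/<h1>,
    # so attacker-chosen text cannot inject markup.
    safe = escape(query)
    # Defense 2: mark the page noindex, so search engines drop these
    # URLs instead of treating the echoed spam as site content.
    return (
        "<html><head>"
        '<meta name="robots" content="noindex">'
        f"<title>Search results for {safe}</title>"
        "</head><body>"
        f"<h1>Search results for {safe}</h1>"
        "</body></html>"
    )
```

The same effect can be achieved with an `X-Robots-Tag: noindex` response header; blocking the search path in robots.txt is weaker, since Google can still index blocked URLs it discovers via links.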
Alternatives: P2P, Law, and Coping Strategies
- Some call for a P2P, RSS-like, or webring-based discovery layer; others respond that such tech exists but lacks adoption.
- A strong thread argues this is fundamentally a political/antitrust problem best addressed by breaking up Google, while skeptics cite laws like the DMCA as evidence that government intervention often worsens concentration.
- A few rely on mailing lists or other media and deliberately de-index themselves, but most acknowledge heavy dependence on Google and the fragility this creates.