Updates to our web search products and Programmable Search Engine capabilities

Change in Programmable Search & New Limits

  • Google is ending “search the entire web” for Programmable Search / Custom Search.
  • New engines are limited to ~50 domains; existing full-web engines must migrate by Jan 1, 2027.
  • Full-web access is being moved behind opaque “enterprise” offerings (Vertex AI Search, custom deals), with unclear pricing and access criteria.

Effect on Niche / Indie Search Engines

  • Many small/niche search sites, ISP homepages, kids’ search portals, privacy-focused search engines, and LLM tools have been using Programmable Search as their backend (the underlying API call is sketched after this list).
  • Commenters expect this will “kill” or severely degrade general-purpose third‑party search built on Google’s index.
  • Some see this as part of a broader trend of Google closing off remaining open/low-friction surfaces (“another one to the Google Graveyard”).
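
For context, “using Programmable Search as a backend” typically means a thin wrapper around the Custom Search JSON API. A minimal sketch in Python; the API key and engine ID are placeholders:

```python
import requests

API_KEY = "YOUR_API_KEY"      # placeholder: a Google Cloud API key
ENGINE_ID = "YOUR_ENGINE_CX"  # placeholder: the Programmable Search engine ID

def search(query: str, num: int = 10) -> list[dict]:
    """Query a Programmable Search engine via the Custom Search JSON API."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": ENGINE_ID, "q": query, "num": num},
        timeout=10,
    )
    resp.raise_for_status()
    # Each result item carries a title, link, and snippet; "items" is
    # simply absent from the response when there are no hits.
    return [
        {"title": it["title"], "url": it["link"], "snippet": it.get("snippet", "")}
        for it in resp.json().get("items", [])
    ]
```

Under the announced change, the engine behind ENGINE_ID can no longer be configured to search the whole web, only an allow-list of domains.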

Kagi, SERP APIs, and Scraping

  • Discussion centers on Kagi’s explanation that Google offers no suitable paid web-search API, forcing reliance on third‑party “SERP APIs” that scrape Google and resell the results (a rough sketch of the model follows this list).
  • Disagreement over whether this is “stealing” vs. a reasonable response to a closed monopoly.
  • Google is already suing at least one such SERP provider; some expect more legal pressure.
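
To make the “SERP API” model concrete, here is a hypothetical sketch: the endpoint, parameter names, and response fields below are invented for illustration and do not describe any real provider’s API.

```python
import requests

def serp_search(query: str, api_key: str) -> list[dict]:
    """Hypothetical SERP-API call: the provider scrapes Google result
    pages server-side and resells them as structured JSON. The endpoint
    and field names here are invented for illustration."""
    resp = requests.get(
        "https://api.example-serp.invalid/v1/search",  # hypothetical endpoint
        params={"q": query, "api_key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["organic_results"]  # hypothetical field name
```

Whether one reads this as “stealing” or as a workaround for a closed index, the mechanics are just that: scraping plus normalization, sold as an API.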

Monopoly, Antitrust, and “Essential Facility”

  • Strong claims that Google Search is a de facto monopoly and an “essential facility” whose index should be open to syndication on fair terms.
  • Complaints about Google “taxing” brands by selling ads against searches for their own trademarks; some argue regulators should ban the practice.
  • Others counter that Google owns its index and is not obligated to let competitors resell it.
  • Several comments tie this to ongoing US antitrust cases; some suspect the 50‑domain model is a legal workaround.

Building Independent Search Indexes

  • Multiple hobby and indie projects are discussed, with index sizes ranging from about 34M to over 1B documents.
  • Consensus: crawling is “the easy part”; ranking and spam fighting are the real, hard work.
  • Techniques mentioned: PageRank-style link analysis, anchor text, behavioral signals, ad-network fingerprints, and link-graph clustering (a minimal PageRank sketch follows this list).
  • Crawlers face blocking, rate limits, and robots.txt rules that often privilege Google/Bing over new entrants.
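
To ground the “ranking is the hard part” consensus, below is a minimal PageRank power-iteration sketch. It illustrates the link-analysis idea mentioned above, nothing more; the toy graph is invented, and real systems layer anchor text, spam signals, and sparse-matrix machinery on top.

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iters: int = 50) -> dict[str, float]:
    """Minimal PageRank by power iteration over an adjacency list
    {page: [pages it links to]}. Illustration only."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# Toy graph: "b" and "c" both link to "a", so "a" ranks highest.
print(pagerank({"a": ["b"], "b": ["a"], "c": ["a"]}))
```

The robots.txt point is simpler than it sounds: many sites ship rules that allow Googlebot or Bingbot and disallow everyone else, and a compliant new crawler has no choice but to honor them.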

Alternatives to Google’s Index

  • Bing’s custom search offerings and APIs are mentioned, but they have likewise been restricted or discontinued and are expensive.
  • Other independent or semi-independent indexes: Mojeek, Qwant/Ecosia’s new European index, Marginalia, YaCy.
  • Skepticism that new entrants can match Google’s breadth, especially for non‑English or niche-language search.
  • Some argue future search will be more vertical/specialized rather than full-web general search.

Impact on LLM Tools and AI Ecosystem

  • Programmable Search was widely used as a cheap, simple web-search tool by third‑party LLM frontends (a tool-calling sketch follows this list).
  • This change is seen as Alphabet closing “AI data leaks” and pushing everyone toward Gemini + Vertex-based grounding.
  • Expectation that some will respond with adversarial scraping rather than official APIs, raising legal and ethical stakes.
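
As for what a “cheap/simple web tool” means in practice: frontends typically expose a search wrapper (like the one sketched earlier) to the model via function calling. A hedged sketch follows; exact tool-schema field names vary by LLM vendor and are not quoted from any specific API.

```python
import json

# A JSON-schema-style tool definition, in the general shape several LLM
# vendors accept; exact field names differ by vendor.
WEB_SEARCH_TOOL = {
    "name": "web_search",
    "description": "Search the web and return titles, URLs, and snippets.",
    "parameters": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}

def handle_tool_call(name: str, arguments: str) -> str:
    """Dispatch a model-issued tool call (arguments arrive as a JSON
    string) to the backend. Reuses the `search` wrapper sketched earlier
    around the Custom Search JSON API."""
    if name == "web_search":
        results = search(json.loads(arguments)["query"])
        return json.dumps(results[:5])  # feed top hits back to the model
    raise ValueError(f"unknown tool: {name}")
```

When the whole-web engine option disappears, every frontend built this way loses its tool at once, which is what drives the adversarial-scraping expectation above.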

Platform Risk & “Don’t Build on Other People’s APIs”

  • The change is cited as a textbook example of why building your core value proposition on a large platform’s API is dangerous.
  • Comparisons are drawn to Twitter’s API lock-down, Bing API changes, and other platform rug-pulls.
  • Advice: own your core infrastructure where possible and treat third‑party APIs as optional enhancements, not moats (see the sketch directly below).
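
One concrete way to act on that advice is an abstraction seam between product code and any particular backend, so a platform rug-pull becomes a swap rather than a rewrite. A minimal sketch; all names here are illustrative:

```python
from typing import Protocol

class SearchProvider(Protocol):
    """Minimal seam between the product and any search backend."""
    def search(self, query: str) -> list[dict]: ...

class GoogleCSEProvider:
    def search(self, query: str) -> list[dict]:
        ...  # call the Custom Search JSON API, as sketched earlier

class LocalIndexProvider:
    def search(self, query: str) -> list[dict]:
        ...  # query your own index: the piece you actually own

def get_provider(own_index_ready: bool) -> SearchProvider:
    # The third-party API is an enhancement; the local index is the
    # floor you fall back to when the platform changes its terms.
    return LocalIndexProvider() if own_index_ready else GoogleCSEProvider()
```

The design choice is the point: the third-party dependency lives behind one interface, so losing it degrades the product instead of ending it.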

Wider Concerns About the Web and Search Quality

  • Many express frustration with modern Google Search (ads, SEO spam, reduced usefulness), and nostalgia for earlier, more “fun” and open web search.
  • Some argue the web itself has degraded (AI slop, walled gardens, SEO spam), making good search intrinsically harder.
  • Others see the clampdown as moving us toward a “private web” controlled by a few US tech giants, and call for stronger state or EU intervention and public/sovereign indexes.