MCP is eating the world

Perceived value vs. over‑marketing

  • Many see MCP as over‑hyped: in their view it is “just tool calling over JSON/HTTP,” a standardization layer rather than a genuinely new capability.
  • Others argue that standardization is exactly the value: it turns an N×M integration problem into roughly N+M (see the sketch after this list) and lets shrink‑wrapped agents (Claude, IDEs, etc.) use arbitrary tools without bespoke wiring.
  • Some commenters are exhausted by influencer‑driven hype but still see MCP as a meaningful step in making LLM‑driven tooling practical and composable.
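
To make that counting argument concrete, here is a trivial sketch; the host and tool counts are invented for illustration:

```python
# Bespoke wiring: every agent host needs its own adapter for every tool/service.
n_hosts, m_tools = 4, 25
bespoke_integrations = n_hosts * m_tools        # N × M = 100 adapters to build and maintain

# Shared protocol: each host ships one MCP client, each tool ships one MCP server.
standardized_integrations = n_hosts + m_tools   # N + M = 29 components
```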

Comparison to existing approaches (REST, CLI, custom agents)

  • Repeated point: everything MCP does could be done with REST, OpenAPI, GraphQL, or CLI tools plus ordinary function/tool calling.
  • Supporters counter that:
    • MCP gives a uniform, agent‑native interface whose tool names, descriptions, and input schemas are written for LLMs rather than humans (roughly the shape sketched after this list).
    • It lets you plug tools into third‑party agents you don’t control, without building your own loop.
  • Several people prefer plain CLI tools: an LLM that can already run shell commands doesn’t need a new protocol, so for them MCP feels redundant, especially in local/dev workflows.
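
For context, an MCP server advertises each tool to the model as a name, a natural‑language description, and a JSON Schema for its arguments. The listing below is a rough sketch of that shape as a Python dict; the Jira‑flavoured tool and its fields are hypothetical, not taken from any real server:

```python
# Approximate shape of one entry a server might return from tools/list.
# The tool itself (summarize_ticket) is invented for illustration.
example_tool = {
    "name": "summarize_ticket",
    "description": (
        "Summarize a Jira ticket for a status update. "
        "Use this when the user asks what a ticket is about or what is blocking it."
    ),
    "inputSchema": {
        "type": "object",
        "properties": {
            "ticket_id": {
                "type": "string",
                "description": "Jira issue key, e.g. PROJ-123",
            },
        },
        "required": ["ticket_id"],
    },
}
```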

Developer & user experience

  • Some find MCP too early and fragile: many servers are “vibe coded,” alpha‑quality, or unreliable; it’s hard to predict when models will invoke tools, and behavior differs across models.
  • Others say building a server is surprisingly simple with wrappers/SDKs; 3–5 well‑documented tools can already power useful internal agents (e.g., Jira/Snowflake summarization, custom workflows); see the sketch after this list.
  • Non‑technical users benefit from being able to “click to add tools” in a chat UI rather than installing CLIs or writing integration code.
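
As a rough illustration of how small such a server can be, here is a minimal sketch using the Python SDK’s FastMCP wrapper; the tool name and its body are assumptions for illustration, and a real server would add auth, error handling, and more tools:

```python
# Minimal MCP server sketch: one tool over stdio (assumes `pip install mcp`).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("internal-helpers")

@mcp.tool()
def summarize_ticket(ticket_id: str) -> str:
    """Summarize a Jira ticket so an agent can report its status to the user."""
    # Hypothetical: fetch the ticket from Jira here and condense it for the model.
    return f"Ticket {ticket_id}: lookup not implemented in this sketch."

if __name__ == "__main__":
    mcp.run()  # stdio transport by default, so a desktop agent can spawn it locally
```

FastMCP derives the advertised name, description, and input schema from the function signature and docstring, which is how a handful of plain Python functions ends up looking like the tool entry sketched earlier.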

Security, privacy, and resource concerns

  • Strong criticism around security: under‑specified controls, prompt injection, OAuth token exposure, data leakage, and difficulty auditing what an agent + MCP stack is doing.
  • Using third‑party MCP servers is called a “privacy nightmare” for sensitive data; some advocate local‑only servers and tight sandboxing.
  • There is debate over resource waste: per‑tool Docker containers vs. monoliths; some see container overhead as negligible, others (especially on macOS) experience slow starts.

Maturity, ecosystem, and future

  • The ecosystem is seen as immature: broken servers, weak security practices, and no clear distribution model for consumers yet.
  • Some expect MCP (or something similar) to become the de facto “JDBC for LLMs,” especially for enterprise workflows; skeptics predict stagnation once hype fades or better standards emerge.