Show HN: Mastra – Open-source JS agent framework, by the developers of Gatsby

Evals, prompts, and observability

  • Team suggests: prototype for a couple of weeks, then spend a few hours writing evals, treating them like performance monitoring (a mix of synthetic checks and “real user” style tests).
  • Some wonder whether evals/observability will consolidate into model providers rather than orchestration frameworks; the Mastra team thinks major providers may avoid strong opinions here.
  • Prompt portability across LLMs is noted as fragile; Mastra has an “agent in local dev” to help improve prompts, but no automated cross-model prompt tuning yet.
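The “treat evals like performance monitoring” suggestion boils down to a small harness: run a fixed set of cases through the model and score each output. A minimal sketch with a stubbed model and a naive containment scorer (all names hypothetical; real evals would call an actual LLM and use richer scorers such as an LLM judge):

```typescript
// Minimal eval harness sketch. Hypothetical names; the "model" is a stub.
type EvalCase = { input: string; mustContain: string };
type EvalResult = { input: string; output: string; pass: boolean };

async function runEvals(
  model: (input: string) => Promise<string>,
  cases: EvalCase[],
): Promise<EvalResult[]> {
  const results: EvalResult[] = [];
  for (const c of cases) {
    const output = await model(c.input);
    // Simple containment scorer; a real setup might use an LLM judge
    // or semantic similarity instead.
    results.push({ input: c.input, output, pass: output.includes(c.mustContain) });
  }
  return results;
}

// Stub standing in for a real LLM call.
const stubModel = async (input: string) => `echo: ${input}`;

const cases: EvalCase[] = [
  { input: "hello", mustContain: "hello" },
  { input: "world", mustContain: "goodbye" },
];

runEvals(stubModel, cases).then((rs) => {
  const passRate = rs.filter((r) => r.pass).length / rs.length;
  console.log(`pass rate: ${passRate}`); // → "pass rate: 0.5"
});
```

Running this on a schedule against production-like inputs is what makes it resemble synthetic monitoring rather than one-off testing.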

TypeScript-first positioning & ecosystem fit

  • Many are excited that Mastra is TS-first with a clear, explicit API, integrating with Vercel’s AI SDK for model routing (including local/Ollama-style endpoints).
  • Others point out that TS/JS agent frameworks already exist (LangChain JS, Vellum, TypedAI, agentic, etc.), questioning the claim that this was “missing.”
  • Some users report positive experiences switching from LangChain to Mastra; others had bad experiences with the AI SDK itself.

Agents, workflows, and features

  • Mastra supports agents, workflows, agent memory, MCP tools (stdio and upcoming SSE), voice agents via multiple TTS providers, and automatic HTTP endpoints for agents/workflows.
  • There is interest in voice-to-voice / realtime-style models and WebSocket support; these are not clearly supported yet.
  • Memory is compared with LangMem and Zep; the hard part is seen as cleanly integrating storage/vector DBs.
  • Users experiment with MCP proxies and tool libraries; many conclude most third‑party MCP servers are thin, low‑quality wrappers and prefer owning their own tools.
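The “own your tools” preference usually means writing tools as plain typed functions plus a description the model can see, instead of proxying a thin third-party MCP server. A sketch of that shape (types and names hypothetical, with a stubbed implementation; this is not Mastra’s or MCP’s actual API):

```typescript
// Hand-rolled tool registry sketch: each tool is a typed function plus a
// description exposed to the LLM. Hypothetical names throughout.
type Tool<I, O> = {
  name: string;
  description: string;
  execute: (input: I) => Promise<O>;
};

const weatherTool: Tool<{ city: string }, { tempC: number }> = {
  name: "get_weather",
  description: "Return the current temperature for a city.",
  // Stubbed: a real tool would call an HTTP API here.
  execute: async ({ city }) => ({ tempC: city.length }),
};

const registry = new Map<string, Tool<any, any>>([[weatherTool.name, weatherTool]]);

async function callTool(name: string, input: unknown): Promise<unknown> {
  const tool = registry.get(name);
  if (!tool) throw new Error(`unknown tool: ${name}`);
  return tool.execute(input);
}

callTool("get_weather", { city: "Oslo" }).then(console.log); // → { tempC: 4 }
```

Owning the registry means you control typing, auth, and error handling, which is the quality gap commenters saw in thin wrapper servers.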

Debating what “agents” are good for

  • Several commenters don’t “get” agents and ask why multiple calls or “personalities” are needed instead of a single strong LLM call.
  • Others explain agents as:
    • Decomposition into smaller steps to combat long-context degradation.
    • Job/workflow orchestration with real-world interactions (web, APIs, code execution).
    • Modularity and specialization (architect vs editor, experts vs generalists).
  • A common reframing: think “steps” or “AI workflow orchestration,” not anthropomorphic “agents.”
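The “steps, not agents” reframing is essentially function composition over model calls: each step gets a narrow prompt and a small context instead of one giant prompt. A sketch with a stubbed model call (names hypothetical):

```typescript
// "AI workflow orchestration" sketch: decompose a task into narrow steps,
// each a separate (stubbed) model call. Hypothetical names.
type Step = (input: string) => Promise<string>;

// Stand-in for an LLM call with a step-specific system prompt.
const llm = (systemPrompt: string): Step =>
  async (input) => `[${systemPrompt}] ${input}`;

const outline = llm("outline the answer");
const draft = llm("write a draft from the outline");
const edit = llm("edit the draft for clarity");

// Sequential pipeline: each step's output feeds the next, keeping every
// individual call's context small to limit long-context degradation.
async function pipeline(input: string, steps: Step[]): Promise<string> {
  let acc = input;
  for (const step of steps) acc = await step(acc);
  return acc;
}

pipeline("why is the sky blue?", [outline, draft, edit]).then(console.log);
```

Nothing here is anthropomorphic: it is ordinary orchestration where some functions happen to be model calls.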

Language, runtime, and framework skepticism

  • Some argue JS/TS is suboptimal for agents compared with Elixir/Erlang-style runtimes, which offer stronger concurrency and state modeling; others counter that most agent workloads are I/O-bound, so JS’s async model is adequate and TypeScript’s developer experience is valuable.
  • There’s broader skepticism that agent frameworks add much beyond basic control flow and glue; several people prefer minimal helpers or roll‑your‑own designs. Others explicitly say they like frameworks and appreciate Mastra’s abstractions.
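The “roll your own” position usually amounts to a small loop: call the model, execute any tool it requests, feed the result back, and stop when it answers. A sketch with a scripted stand-in model (all names hypothetical; not any framework’s actual API):

```typescript
// Minimal hand-rolled agent loop. The "model" is scripted here; in
// practice it would be an LLM that either answers or requests a tool.
type ModelTurn =
  | { type: "answer"; text: string }
  | { type: "tool_call"; tool: string; input: string };

type Model = (transcript: string[]) => Promise<ModelTurn>;

const tools: Record<string, (input: string) => Promise<string>> = {
  add: async (input) => {
    const [a, b] = input.split("+").map(Number);
    return String(a + b);
  },
};

async function agentLoop(model: Model, userMessage: string, maxTurns = 5) {
  const transcript = [userMessage];
  for (let i = 0; i < maxTurns; i++) {
    const turn = await model(transcript);
    if (turn.type === "answer") return turn.text;
    // Execute the requested tool and feed the result back to the model.
    const result = await tools[turn.tool](turn.input);
    transcript.push(`tool ${turn.tool} returned ${result}`);
  }
  throw new Error("max turns exceeded");
}

// Scripted model: request the tool once, then answer with its result.
const scriptedModel: Model = async (transcript) =>
  transcript.length === 1
    ? { type: "tool_call", tool: "add", input: "2+3" }
    : { type: "answer", text: transcript[transcript.length - 1] };

agentLoop(scriptedModel, "what is 2+3?").then(console.log);
// → "tool add returned 5"
```

The framework debate is largely about how much value there is above this loop: streaming, retries, memory, and observability on one side, versus the simplicity of owning ~30 lines on the other.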

Licensing, lock-in, and business model

  • Strong pushback on calling Mastra “open source” while using the Elastic License 2.0 (ELv2); critics say this is misleading since the license forbids offering Mastra as a hosted/managed service.
  • Mastra’s rationale: allow almost any user behavior but block cloud giants from reselling it.
  • Some worry about “lock-in” via the Vercel AI SDK; others respond that it’s just an MIT OSS library, similar to any other dependency.
  • Pricing is currently unclear; a hosted cloud platform is in beta and appears to be the monetization path.

Gatsby legacy and trust

  • The “by the developers of Gatsby” tagline draws mixed reactions.
  • Some praise the team’s past framework experience; others recall Gatsby as painful or overpromising and see the association as a negative or a sign of future “abandonware.”

API design and ergonomics feedback

  • The fluent .step().then().after().then().commit() workflow DSL is criticized as awkward and hard to read for branching graphs; suggestions include nested structures or explicit dependency arrays.
  • Mastra devs are receptive and mention tickets to support more explicit edge definitions.
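The explicit-dependency-array suggestion can be sketched as a plain graph description that a tiny executor orders topologically. This is the shape commenters proposed, not Mastra’s actual API (all names hypothetical):

```typescript
// Workflow as explicit nodes + dependency arrays, instead of a fluent
// .step().then().after() chain. Hypothetical shape, not Mastra's API.
type Node = { id: string; deps: string[]; run: () => string };

function execute(nodes: Node[]): string[] {
  const done = new Set<string>();
  const order: string[] = [];
  // Naive topological execution: run any node whose deps are all done.
  while (done.size < nodes.length) {
    const ready = nodes.find(
      (n) => !done.has(n.id) && n.deps.every((d) => done.has(d)),
    );
    if (!ready) throw new Error("cycle or missing dependency");
    order.push(ready.run());
    done.add(ready.id);
  }
  return order;
}

const workflow: Node[] = [
  { id: "fetch", deps: [], run: () => "fetch" },
  // Branching is just two nodes sharing a dependency:
  { id: "summarize", deps: ["fetch"], run: () => "summarize" },
  { id: "classify", deps: ["fetch"], run: () => "classify" },
  // A join point lists multiple dependencies:
  { id: "merge", deps: ["summarize", "classify"], run: () => "merge" },
];

console.log(execute(workflow)); // → ["fetch", "summarize", "classify", "merge"]
```

The appeal is that branches and joins are visible in the data structure itself, rather than encoded in the order of chained method calls.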