Launch HN: Onyx (YC W24) – Open-source chat UI
Licensing and “Open Source” Debate
- Significant discussion over whether Onyx is truly open source or “source-available.”
- Core chat, RAG, research, and SSO code is MIT-licensed; an `ee` (enterprise) subdirectory is proprietary, and there is a fully MIT “FOSS” repo.
- Some argue MIT core plus paid enterprise features is a standard open-core model and clearly OSI-compliant; others see it as “fauxpen source” and worry about future rug-pulls and VC pressure.
- Confusion stems from mixed licensing in one repo and references to subscription licenses; some want stricter separation or more transparency.
Product Positioning and Differentiation
- Critics question why this was funded given many similar projects (OpenWebUI, LibreChat, AnythingLLM, Vercel’s tooling, etc.) and limited moat.
- Supporters and the team emphasize:
  - Strong RAG and connector suite (40+ connectors, many community-contributed).
  - Deep research and multi-step tool/agent flows, not just “chat + single tool call” (see the sketch after this list).
  - Enterprise features: SSO, RBAC, analytics, white-labeling, BYOK, and multi-model support.
- Onyx is pitched as more stable, better documented, and more enterprise-ready than some of the more popular alternative UIs.
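To make the “multi-step tool/agent flows” point concrete, here is a minimal, hypothetical sketch of a retrieve/plan/act loop. The class and function names are illustrative and do not describe Onyx’s actual internals.

```python
# Illustrative only: a generic multi-step tool/agent loop of the kind the
# thread contrasts with "chat + single tool call". Names are hypothetical,
# not Onyx's implementation.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class AgentStep:
    thought: str
    tool: str | None = None          # None means the model chose to answer
    tool_input: str | None = None
    observation: str | None = None

@dataclass
class DeepResearchAgent:
    llm: Callable[[str], AgentStep]           # decides the next step from a transcript
    tools: dict[str, Callable[[str], str]]    # e.g. {"search": ..., "retrieve": ...}
    max_steps: int = 5
    history: list[AgentStep] = field(default_factory=list)

    def run(self, question: str) -> str:
        transcript = f"Question: {question}"
        for _ in range(self.max_steps):
            step = self.llm(transcript)                  # plan the next action
            if step.tool is None:                        # no more tools: final answer
                return step.thought
            step.observation = self.tools[step.tool](step.tool_input or "")
            self.history.append(step)
            transcript += f"\n{step.tool}({step.tool_input}) -> {step.observation}"
        return "Step budget exhausted; returning best partial answer."
```

The contrast drawn in the thread is that such a loop can chain several retrievals and tool calls before answering, rather than making one call per user turn.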
UX: Simplicity vs Power Users
- Some praise a clean, non-intimidating chat UI for enterprise users who just want “a window to AI.”
- Others argue chat is a poor UX for many workflows and lament the loss of fine-grained controls found in tools like SillyTavern and ComfyUI.
- The team says Onyx aims for simple defaults layered over power features (code interpreter, RAG, deep research) and plans to reintroduce more configurability.
Maturity, Deployment, and Performance Concerns
- One user reports “unbaked” admin and document/RAG workflows: hard to track ingested content, regroup documents, and inspect references.
- Resource footprint and deployment complexity draw criticism (many containers, vector DB, high RAM/CPU requirements); some want a minimal, low-resource mode.
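For context on the “minimal, low-resource mode” request, the sketch below shows what an in-process retrieval fallback could look like in place of a dedicated vector-DB container. The hashing “embedding” is a stand-in for a real embedding model, and nothing here is an actual Onyx feature.

```python
# Illustrative only: an in-process, low-resource retrieval fallback of the
# kind some commenters ask for, instead of a separate vector-DB container.
import hashlib
import math

def embed(text: str, dims: int = 256) -> list[float]:
    # Toy hashed bag-of-words vector; a real deployment would use an embedding model.
    vec = [0.0] * dims
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

class InMemoryIndex:
    def __init__(self) -> None:
        self.docs: list[tuple[str, list[float]]] = []

    def add(self, text: str) -> None:
        self.docs.append((text, embed(text)))

    def search(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        scored = [(sum(a * b for a, b in zip(q, v)), text) for text, v in self.docs]
        return [text for _, text in sorted(scored, reverse=True)[:k]]
```

A fallback like this trades recall quality for footprint, which is roughly the trade-off commenters are asking to be able to opt into.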
Enterprise Use Cases and Competition
- Seen as promising for regulated or air‑gapped environments where ChatGPT/Copilot are hard to deploy or too expensive per seat.
- Value propositions: model flexibility (no vendor lock-in), richer connectors than the model vendors themselves offer, and the ability to fork and customize (a provider-swap sketch follows this list).
- Others question the longevity of a horizontal “one chat to rule them all” product versus specialized vertical AI tools.
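The “no lock-in” argument boils down to being able to point the same chat code at different OpenAI-compatible endpoints, including self-hosted ones in air-gapped environments. The sketch below uses the OpenAI Python client’s `base_url` override; the endpoints and model names are examples, not Onyx configuration.

```python
# Illustrative only: the "bring your own key / no lock-in" idea from the thread,
# shown with the OpenAI Python client's base_url override.
from openai import OpenAI

PROVIDERS = {
    "openai": {"base_url": "https://api.openai.com/v1", "model": "gpt-4o-mini"},
    # An air-gapped deployment could point at a local server that speaks the
    # same API (e.g. vLLM or Ollama's OpenAI-compatible endpoint).
    "local": {"base_url": "http://localhost:11434/v1", "model": "llama3.1"},
}

def chat(provider: str, prompt: str, api_key: str = "not-needed-for-local") -> str:
    cfg = PROVIDERS[provider]
    client = OpenAI(base_url=cfg["base_url"], api_key=api_key)
    resp = client.chat.completions.create(
        model=cfg["model"],
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content or ""
```

For example, `chat("local", "Summarize the incident report")` would hit a self-hosted model, while switching the provider key routes the identical call to a hosted vendor.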
Feature Requests and Future Directions
- Requests include: mobile and desktop apps, better chat history search and organization, multimodal document handling, voice mode, scheduled actions, better local-model support, and lighter installs.
- Extensibility via connectors/tools/agents is seen as critical; some want tighter integration with frameworks like LangChain.
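Since connectors are called out as the critical extensibility point, here is a generic, hypothetical sketch of what a document-connector interface might look like; it is not Onyx’s actual connector API.

```python
# Illustrative only: a generic document-connector interface of the sort the
# thread treats as the key extensibility point. Names are hypothetical.
from dataclasses import dataclass
from typing import Iterator, Protocol

@dataclass
class Document:
    id: str
    text: str
    source: str
    metadata: dict[str, str]

class Connector(Protocol):
    def load(self) -> Iterator[Document]:
        """Yield documents from the upstream system (wiki, drive, ticket tracker, ...)."""

class StaticNotesConnector:
    """Toy connector that serves hard-coded notes, standing in for a real integration."""

    def __init__(self, notes: dict[str, str]) -> None:
        self.notes = notes

    def load(self) -> Iterator[Document]:
        for doc_id, text in self.notes.items():
            yield Document(id=doc_id, text=text, source="static-notes", metadata={})
```

Keeping the interface to a single `load()` generator is what makes community-contributed connectors cheap to write and easy to plug into an ingestion pipeline, which is presumably why commenters treat the connector suite as the main extension surface.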