Ollama's new app

Platform support & implementation

  • Announcement headline emphasizes macOS and Windows; many note the absence of a Linux GUI despite Linux being a key platform for devs and servers.
  • Linux currently has CLI only; several users say that’s fine for “power users” but harms mainstream Linux adoption.
  • Some assumed the app was Electron; others clarify it’s Tauri, which uses the system webview rather than bundling Chromium. Debate over whether that counts as “native”.

Backend vs frontend focus & target audience

  • Long‑time users see Ollama primarily as a local LLM backend and are uneasy about effort going into a first‑party GUI instead of model/engine improvements.
  • Homepage shift away from CLI prominence is read by some as a pivot from developers to “regular users”; maintainers deny a pivot and say the GUI helps dogfood the backend.
  • Some welcome a simple official UI to onboard non‑technical friends and enterprise users; others call it “unnecessary” given existing frontends.

Comparisons with other tools

  • Many already use Open WebUI, LM Studio, Msty, Jan, AnythingLLM, LobeChat, etc., usually with Ollama or llama.cpp as the backend.
  • Several claim Open WebUI is significantly more feature‑rich; LM Studio is often cited as the best “all‑in‑one” GUI, especially on macOS.
  • Some suggest just building a custom UI using an OpenAI‑compatible API; others push back that time and complexity make that unrealistic for most.
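
  For scale, the “just build it” crowd has a point about the API surface: Ollama exposes an OpenAI‑compatible endpoint under /v1, so a bare‑bones chat loop is tiny. A minimal sketch, assuming a default local install and a placeholder model name (“llama3.2”) that has already been pulled:

      # Minimal chat loop over Ollama's OpenAI-compatible API.
      # Assumes `ollama serve` on the default localhost:11434; the model
      # name is a placeholder for whatever is pulled locally.
      from openai import OpenAI

      # The client requires an api_key argument, but Ollama ignores it.
      client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

      history = []
      while True:
          history.append({"role": "user", "content": input("> ")})
          resp = client.chat.completions.create(model="llama3.2", messages=history)
          answer = resp.choices[0].message.content
          history.append({"role": "assistant", "content": answer})
          print(answer)

  The pushback is that the hard part isn’t this call but everything around it: streaming, rendering, persistence, packaging.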

Open source, licensing & trust

  • Users notice the new desktop app is closed source while the core remains open; some express disappointment and wish the UI were OSS.
  • Broader OSS drama: Open WebUI’s and LobeChat’s licenses are criticized as not OSI‑compatible despite “open source” branding.
  • Detailed critique from Reddit is relayed: alleged vendor lock‑in, non‑upstreamed llama.cpp tweaks, proprietary model handling, and confusing model naming.
  • Counterpoints argue these claims are exaggerated, note the permissive licenses, and frame the tradeoffs as usability vs raw performance.

Features, UX, and missing capabilities

  • Early reports praise the new app’s simplicity, multimodal and Markdown support, and treatment of “thinking” models.
  • Frequently requested:
    • Remote‑backend support (run the UI on one machine, inference on another); see the sketch after this list.
    • Shared local model storage across apps.
    • Tool‑calling/MCP, web search, richer integrations (GitHub, YouTube).
    • Clearer VRAM fit indicators and easier context‑window control (also shown in the sketch below).
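
  Two of the requests above are already reachable through the existing HTTP API, just not through the new app. A hedged sketch; the remote address and model name are hypothetical placeholders, and the server side assumes `ollama serve` was started with OLLAMA_HOST=0.0.0.0 so it listens beyond localhost:

      # Talk to a backend on another machine and override the context
      # window per request via Ollama's native /api/chat endpoint.
      import requests

      OLLAMA = "http://192.168.1.50:11434"  # hypothetical remote box

      resp = requests.post(f"{OLLAMA}/api/chat", json={
          "model": "llama3.2",  # placeholder model name
          "messages": [{"role": "user", "content": "hello"}],
          "options": {"num_ctx": 8192},  # per-request context-window override
          "stream": False,
      })
      print(resp.json()["message"]["content"])

  Shared model storage is likewise an environment‑variable affair today (OLLAMA_MODELS sets the model directory); what’s missing is a convention other apps honor.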
  • Some fear the GUI focus may slow support for cutting‑edge models; maintainers say they prioritize major releases and support direct GGUF pulls.
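
  “Direct GGUF pulls” refers to fetching a GGUF straight from Hugging Face by name, no Modelfile required. A sketch with a hypothetical repo name, using the documented hf.co path (the CLI equivalent is `ollama run hf.co/<user>/<repo>`):

      # Pull a Hugging Face-hosted GGUF through Ollama's pull API.
      # The repo name is a hypothetical placeholder.
      import requests

      requests.post(
          "http://localhost:11434/api/pull",
          json={"model": "hf.co/someuser/some-model-GGUF", "stream": False},
      )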