A hackable AI assistant using a single SQLite table and a handful of cron jobs

Overall Reaction & Design Approach

  • Strong enthusiasm for the project’s pragmatism: a single SQLite table, cron jobs, and direct API calls instead of vector DBs or heavy agent frameworks (a minimal sketch of the pattern follows this list).
  • Many see it as a great “weekend-hack” template and a realistic pattern for personal AI tools.
  • Some find the retro-butler UI charming; others see the verbosity as exactly what they don’t want from assistants.
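
To make the pattern above concrete, here is a minimal sketch of the single-table-plus-cron idea: one SQLite table of date-stamped "memories" and a cron-invoked script that turns the relevant rows into a briefing prompt. The schema, file paths, and function names are illustrative, not the original project's.

```python
# briefing.py -- run from cron, e.g.:  0 7 * * * python3 briefing.py
# Illustrative sketch: one SQLite table of date-stamped "memories"
# plus a daily job that turns the relevant rows into an LLM prompt.
import sqlite3
import datetime

DB_PATH = "assistant.db"  # hypothetical path

SCHEMA = """
CREATE TABLE IF NOT EXISTS memories (
    id         INTEGER PRIMARY KEY,
    created_at TEXT NOT NULL,      -- ISO-8601 timestamp
    date       TEXT,               -- day the memory is relevant to (nullable)
    source     TEXT NOT NULL,      -- 'email', 'telegram', 'calendar', ...
    content    TEXT NOT NULL
);
"""

def relevant_memories(conn, days_ahead=7):
    """Fetch undated memories plus anything dated within the next week."""
    today = datetime.date.today()
    horizon = today + datetime.timedelta(days=days_ahead)
    rows = conn.execute(
        "SELECT date, source, content FROM memories "
        "WHERE date IS NULL OR (date >= ? AND date <= ?) "
        "ORDER BY date",
        (today.isoformat(), horizon.isoformat()),
    )
    return rows.fetchall()

def build_prompt(memories):
    lines = [f"[{d or 'undated'}] ({src}) {text}" for d, src, text in memories]
    return "Write a short daily brief from these notes:\n" + "\n".join(lines)

if __name__ == "__main__":
    conn = sqlite3.connect(DB_PATH)
    conn.executescript(SCHEMA)
    prompt = build_prompt(relevant_memories(conn))
    print(prompt)  # in the real setup this would be sent to an LLM API
```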

Email as an Interface for AI Assistants

  • Several commenters independently converge on “email as the perfect UI” for AI coworkers:
    • Universal, asynchronous, text + attachments, works with existing tools like Outlook/Gmail.
    • Good fit for slow “research” tasks, status updates, journaling, receipt parsing, and simple CMS-like systems.
  • Examples:
    • Daily journaling by replying to an automated prompt email, with the reply POSTed into a DB.
    • Agents parsing templated or JSON email bodies; services like Mailgun/CloudMailin to turn email into webhooks (see the sketch after this list).
    • Gmail + Pub/Sub hooks for instant automation, including LLM-based tagging and SMS/phone alerts.
  • Counterpoint: for purely service-to-service communication under full control, tools like MQTT or ntfy are seen as simpler and more robust than email.
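
A minimal sketch of the email-to-database flow described above, assuming a Mailgun-style inbound webhook (the `sender`/`subject`/`stripped-text` field names follow Mailgun's route payload; other providers differ) and the illustrative `memories` table from the earlier sketch:

```python
# inbound_email.py -- sketch of turning inbound email into DB rows.
# Assumes a Mailgun-style form POST; field names (sender, subject,
# stripped-text) are Mailgun's -- adapt for CloudMailin or others.
import sqlite3
import datetime
from flask import Flask, request

app = Flask(__name__)
DB_PATH = "assistant.db"  # same illustrative DB as the briefing sketch

@app.post("/inbound-email")
def inbound_email():
    sender = request.form.get("sender", "")
    subject = request.form.get("subject", "")
    body = request.form.get("stripped-text") or request.form.get("body-plain", "")

    conn = sqlite3.connect(DB_PATH)
    conn.execute(
        "INSERT INTO memories (created_at, date, source, content) "
        "VALUES (?, NULL, 'email', ?)",
        (datetime.datetime.utcnow().isoformat(),
         f"{sender} / {subject}: {body.strip()}"),
    )
    conn.commit()
    conn.close()
    return "ok", 200

if __name__ == "__main__":
    app.run(port=8080)
```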

Transport & Integration Choices

  • Discussion of using Telegram vs Slack/Discord; Telegram is seen as low-friction for bots and mobile access, though concerns are raised about its default lack of E2E encryption.
  • People list alternative channels (Telegram bots, MQTT, ntfy, Twilio, smartphone UIs, Raspberry Pi touchscreens).
  • Some are building email- or Telegram-based “AI butlers” that run commands, manage tasks, parse receipts, or orchestrate Notion/Todoist (a minimal Telegram polling loop is sketched below).
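
A Telegram bot needs nothing beyond HTTPS long polling; the sketch below uses the Bot API's `getUpdates` and `sendMessage` methods, with the token handling and the echo-style handler as placeholders:

```python
# telegram_butler.py -- minimal long-polling loop against the Telegram Bot API.
# getUpdates / sendMessage are real Bot API methods; everything else
# (token env var, the echo-style "handler") is a placeholder.
import os
import requests

TOKEN = os.environ["TELEGRAM_BOT_TOKEN"]   # create a bot via @BotFather
API = f"https://api.telegram.org/bot{TOKEN}"

def handle(text: str) -> str:
    # Placeholder: a real butler would store a memory, run a command, etc.
    return f"Noted: {text}"

def main():
    offset = None
    while True:
        resp = requests.get(
            f"{API}/getUpdates",
            params={"timeout": 60, "offset": offset},
            timeout=70,
        ).json()
        for update in resp.get("result", []):
            offset = update["update_id"] + 1
            message = update.get("message") or {}
            text = message.get("text")
            if text:
                requests.post(
                    f"{API}/sendMessage",
                    json={"chat_id": message["chat"]["id"], "text": handle(text)},
                    timeout=10,
                )

if __name__ == "__main__":
    main()
```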

LLM Cost, Capability & Context Handling

  • Multiple comments emphasize how cheap hosted LLMs are now (fractions of a cent per prompt) and how small the daily-briefing prompt actually is.
  • Others still prefer local models via tools like Ollama for privacy, noting that 1.5B–3B parameter models are a practical minimum for reliability (a local-inference call is sketched after this list).
  • Strategies discussed for avoiding context-window bloat:
    • Date-stamped “memories” so only relevant near-term items go into the prompt.
    • Periodic summarization/compression of older context, with a DB as long-term memory and possible vector/FTS search (including SQLite extensions; see the FTS sketch after this list).
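
For the local-model route mentioned above, a sketch of sending the briefing prompt to Ollama's HTTP API instead of a hosted provider; the `/api/generate` endpoint and JSON shape are Ollama's, while the model name is just an example:

```python
# local_brief.py -- send the daily-brief prompt to a locally running model
# via Ollama's HTTP API instead of a hosted LLM provider.
import requests

def local_brief(prompt: str, model: str = "llama3.2:3b") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(local_brief("Write a two-line daily brief: dentist at 3pm, rain expected."))
```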
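
For keyword recall without a vector database, SQLite's built-in FTS5 extension is often enough; a sketch over the illustrative `memories` table, with only the insert trigger shown for brevity:

```python
# memory_search.py -- keyword recall over the memories table using SQLite's
# built-in FTS5 extension, as an alternative to a separate vector DB.
# Table and column names match the earlier illustrative schema.
import sqlite3

DB_PATH = "assistant.db"

def ensure_index(conn):
    conn.executescript("""
    CREATE VIRTUAL TABLE IF NOT EXISTS memories_fts
        USING fts5(content, content='memories', content_rowid='id');
    -- Keep the index in sync on insert (update/delete triggers omitted).
    CREATE TRIGGER IF NOT EXISTS memories_ai AFTER INSERT ON memories BEGIN
        INSERT INTO memories_fts(rowid, content) VALUES (new.id, new.content);
    END;
    """)
    # Backfill rows inserted before the index/trigger existed (cheap for small tables).
    conn.execute("INSERT INTO memories_fts(memories_fts) VALUES ('rebuild')")

def recall(conn, query, limit=5):
    """Return the few most relevant memories to splice into a prompt."""
    return conn.execute(
        "SELECT m.date, m.content FROM memories_fts "
        "JOIN memories m ON m.id = memories_fts.rowid "
        "WHERE memories_fts MATCH ? ORDER BY rank LIMIT ?",
        (query, limit),
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(DB_PATH)
    ensure_index(conn)
    for date, content in recall(conn, "dentist OR appointment"):
        print(date or "undated", content)
```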

Privacy, Security & Trust

  • Significant concern about sending personal/family data to commercial LLM APIs and over insecure channels.
  • Some argue cloud providers’ “we don’t train on your data” promises are acceptable; others are deeply skeptical.
  • Suggested mitigations include keeping inference inside a single provider’s privacy boundary or running smaller models locally.
  • Security risks of agents with access to email/commands are noted (prompt injection, data exfiltration, unsafe command execution).

Usefulness vs Overcomplication

  • One camp questions whether this truly simplifies life versus just centralizing what a calendar already does.
  • Others say the value is in aggregating many small data sources (family calendars, mail, weather, deliveries) into one coherent, personalized daily brief.
  • Several emphasize that even if niche or bespoke, these “personal software” tools can be life-changing for their individual creators.

Big-Tech Assistants & Missed Opportunities

  • Repeated criticism of Siri (and to a lesser extent Google’s assistant) for poor reliability and trivial features compared to what a lone hacker can do.
  • Some argue large companies are constrained by monetization, privacy risk, internal coordination, and product bets, leaving a gap for personal/OSS assistants.

Ecosystem & Future Directions

  • Interest in an open-source, extensible “family assistant” framework with pluggable integrations (calendar, email, home automation, etc.), possibly powered by MCP or similar plugin systems (a bare-bones plugin interface is sketched after this list).
  • Several share or reference related DIY setups using Apple Shortcuts, Home Assistant, n8n, SQLite+vector extensions, and multi-LLM routing.
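
A bare-bones sketch of what a pluggable integration contract could look like; this is a hypothetical interface for illustration, not MCP's or any existing framework's API:

```python
# plugins.py -- hypothetical sketch of a pluggable "family assistant" core:
# each integration implements one method that returns memories to store.
# Illustrative only; not MCP or any existing framework's interface.
from collections.abc import Iterable
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Memory:
    date: str | None   # ISO date the item is relevant to, or None
    source: str
    content: str

class Integration(Protocol):
    name: str
    def fetch(self) -> Iterable[Memory]:
        """Pull new items (calendar events, emails, sensor states, ...)."""
        ...

class WeatherIntegration:
    name = "weather"
    def fetch(self) -> Iterable[Memory]:
        # A real plugin would call a weather API here.
        return [Memory(date=None, source=self.name, content="Rain expected tomorrow")]

def run_all(integrations: list[Integration]) -> list[Memory]:
    """Cron entry point: gather memories from every registered plugin."""
    collected: list[Memory] = []
    for plugin in integrations:
        collected.extend(plugin.fetch())
    return collected

if __name__ == "__main__":
    for m in run_all([WeatherIntegration()]):
        print(m)
```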