
Zenii

MIT · Rust 2024

A local AI backend for developers

20 megabytes. AI everywhere.

Install one binary. Every tool on your machine gets AI memory, a shared brain, and a local API. Private. Fast. Extensible.

ChatGPT is a tab you open. Zenii is a capability your machine gains.

6 AI providers — or bring your own OpenAI-compatible provider
Your data never leaves your machine
20 MB. Powered by Rust. MIT licensed.
Works with MCP clients: Claude Desktop · Claude Code · Cursor · Cline · Continue · Windsurf · Codex
<20 MB binary with GUI · 133 API routes · 16 built-in tools · 6+ AI providers · 110+ config fields · 0 telemetry · MIT licensed

Why Zenii?

The Problem

You've tried the alternatives. Open Interpreter for code execution. Khoj for document search. Gemini CLI for your terminal. OpenClaw for chat integrations. But none of them is an API server. None gives you 133 routes your scripts can call. None has a native desktop app AND a plugin system in any language AND persistent vector memory AND cron scheduling — all in 20 MB.

There is no tool that is simultaneously:

  • An API server your curl can call (not just a chat interface)
  • A native desktop app (not Electron, not a browser tab)
  • Extensible in any language (not locked to one ecosystem)
  • An AI that remembers and learns (not session-by-session amnesia)

Zenii fills that gap. 20 MB. MIT licensed. Yours.

Built for real desktop AI work

Not a chatbot. An API server.

133 routes. curl http://localhost:18981. Your scripts, cron jobs, and browser extensions all get AI — no SDK required. Interactive docs at /api-docs.

A real desktop app. Not Electron.

Tauri 2 + Svelte 5. A binary under 20 MB. A native app that respects your machine.

Write plugins in Python, Go, JS — or anything.

JSON-RPC 2.0 over stdio. Any language that reads stdin and writes stdout works. Plugins are first-class citizens. A plugin is ~15 lines.
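A sketch of what such a plugin can look like, assuming a hypothetical `greet` method and one JSON-RPC request per line on stdin. Zenii's actual handshake and method names aren't documented on this page; only the JSON-RPC 2.0 over stdio mechanics are shown.

```python
import json
import sys

def handle(request):
    """Dispatch one JSON-RPC 2.0 request to a plugin method."""
    if request.get("method") == "greet":  # hypothetical example method
        name = request.get("params", {}).get("name", "world")
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "result": f"Hello, {name}!"}
    # Standard JSON-RPC "method not found" error
    return {"jsonrpc": "2.0", "id": request.get("id"),
            "error": {"code": -32601, "message": "Method not found"}}

def main():
    # One request per line on stdin, one response per line on stdout.
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        sys.stdout.write(json.dumps(handle(json.loads(line))) + "\n")
        sys.stdout.flush()

if __name__ == "__main__":
    main()
```

Because the protocol is just line-delimited JSON on stdin/stdout, the same shape ports directly to Go, Node, or a shell script.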

It remembers. Across sessions, across restarts.

SQLite FTS5 + vector search. Conversations, tool results, and context survive restarts. Your AI gets better over time. Ask it about last month.

Gets smarter over time. Asks before changing.

Self-evolving agent capabilities with human-approved proposals. The AI learns your preferences and grows its skills — with your permission.

Security is architecture, not a checkbox.

6 layers active by default: OS keyring, autonomy controls, filesystem sandboxing, injection detection, rate limiting, and full audit trail.

Every tool you use today is an island. Zenii is where they all converge.


AI Providers

OpenAI
Claude
Gemini
Mistral
Ollama

Messaging

Telegram
Slack
Discord

Dev Tools

Python
Go
Terminal
curl
Cron
Plugins
Zenii

Write a memory from Telegram. Recall it from Python. Schedule a task from the CLI. Get notified on Discord. Everything shares the same brain.

One memory. Four interfaces. Zero configuration.

9:00 AM · Desktop app

"Production DB moved to port 5434" — stored to memory

10:15 AM · Python deploy script

curl localhost:18981/chat → gets the new port automatically

2:00 PM · Telegram

Teammate asks the bot → same answer, same memory

3:00 PM · Cron report

Scheduled status report → includes the DB update

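The 10:15 AM step could look like this from Python, using only the standard library. The `/chat` payload shape mirrors the curl examples on this page; the `ask` helper name is ours.

```python
import json
import urllib.request

ZENII = "http://localhost:18981"  # Zenii's default local port

def build_payload(session_id, prompt):
    """Body for POST /chat, mirroring the curl example on this page."""
    return {"session_id": session_id, "prompt": prompt}

def ask(session_id, prompt):
    """POST a prompt to a running Zenii instance and return the JSON reply."""
    req = urllib.request.Request(
        f"{ZENII}/chat",
        data=json.dumps(build_payload(session_id, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Deploy script: same brain as the desktop app, so it already
    # knows the DB moved to port 5434.
    print(ask("deploy", "Which port is the production DB on?"))
```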

Get started in minutes

  1. Go to the download section above and grab the installer for your platform.
  2. Run the installer (.dmg for macOS, .msi for Windows, .deb/.rpm for Linux).
  3. Launch Zenii from your applications menu.
  4. Configure your first LLM provider in Settings.

System Architecture

A single 20 MB binary — gateway, core engine, plugin host, and persistent memory — powered by Rust.

[Architecture diagram: client layer; API gateway with 133 routes; core engine with AI orchestration and tool execution; plugin system supporting any language; data layer with SQLite and vector storage.]

6-layer security defense

[Diagram: six concentric defense rings: sandboxed execution, permission gates, token budgets, rate limiting, audit logging, and model-level safety.]

Autonomy Modes

Strict

Agent operates within tightly constrained boundaries. Minimal autonomy, maximum oversight.

Supervised

Agent proposes actions. User confirms or rejects each one before execution.

Autonomous

Agent executes autonomously within configured boundaries. Use with caution.

All data stays on your machine. No telemetry, no cloud sync, no account required. Zenii is at version 0.1.21 and actively developed.

Engineering Decisions That Ship

These aren't spec-sheet items. Each is an engineering decision that directly affects your experience.

One brain, four bodies

All business logic lives in a single shared core. Desktop, CLI, TUI, and daemon are thin shells. One update improves all four interfaces simultaneously. No feature fragmentation. No "desktop has it but CLI doesn't."

Trait-driven everything

Memory, credentials, AI providers, tools, channels — every major subsystem is behind a Rust trait. Swap the memory backend from SQLite to Postgres. Replace the credential store. Change AI providers. Nothing else needs to change.

Feature-gated lean binary

The default binary ships with the gateway, AI engine, and keyring. Telegram? Discord? Scheduler? API docs? All opt-in via feature flags. You install what you need. The binary doesn't waste a single byte on what you don't.

Hybrid FTS5 + vector scoring

Memory search isn't keyword OR semantic — it's both. Weighted combination of SQLite FTS5 full-text search and sqlite-vec vector embeddings. Memory that understands both what you said and what you meant.
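As a toy illustration of the weighted combination: the 0.4/0.6 weights and the normalized scores below are invented for this sketch, not Zenii's actual defaults.

```python
def hybrid_score(fts_score, vec_score, fts_weight=0.4, vec_weight=0.6):
    """Blend a normalized FTS5 relevance score with a vector-similarity
    score. Weights are illustrative, not Zenii's defaults."""
    return fts_weight * fts_score + vec_weight * vec_score

def rank(candidates):
    """candidates: (memory_id, fts_score, vec_score) with scores in [0, 1].
    Returns memory ids ordered by combined relevance, best first."""
    return [mid for mid, f, v in
            sorted(candidates, key=lambda c: hybrid_score(c[1], c[2]),
                   reverse=True)]

# A hit that is strong semantically but weak on keywords can still win:
order = rank([("keyword-hit", 0.9, 0.1), ("semantic-hit", 0.2, 0.95)])
```

With these weights, "semantic-hit" scores 0.65 against "keyword-hit" at 0.42, so a memory that matches what you meant outranks one that only matches what you typed.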

Context auto-discovery

Before every request, Zenii detects which domains are relevant (Channels? Scheduler? Skills? Tools?) and only injects that context into the prompt. Result: ~65% fewer tokens per request than naive "dump everything" approaches. Directly saves you money on API calls.

OS keyring with async fallback

Your API keys live in your OS keyring (macOS Keychain, Windows Credential Manager, Linux Secret Service) — never in a config file, never in a .env. If the keyring isn't available, Zenii falls back gracefully to an in-memory store. Credentials are zeroized from memory when no longer needed.

One codebase. Four interfaces. Zero duplication.

Desktop app
CLI
TUI
Daemon
zenii-core

Every binary — desktop app, CLI, TUI, and daemon — is a thin shell over the same shared core. The desktop binary is 67 lines of Rust. The daemon is 74. All business logic, all 133 routes, all 18 tools, all security layers live in one place.

When we add a feature, every interface gets it. When we fix a bug, it's fixed everywhere. This isn't an accident — it's a strict architectural rule enforced since day one.

Your AI gets smarter. You stay in control.

Most AI tools are static — they do exactly what they did on day one. Some self-modify without asking. Zenii takes a third path:

  1. Zenii observes your patterns and preferences over time
  2. Zenii proposes skill modifications — "I notice you always want code reviews on Fridays. Want me to schedule that?"
  3. You approve or reject — like a PR from your AI
  4. Zenii learns — approved changes become permanent skills

Your AI gets smarter. You stay in control. No surprises.
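The loop above, modeled as data. The `SkillProposal` shape is invented for illustration; Zenii's real proposal schema isn't shown on this page.

```python
from dataclasses import dataclass

@dataclass
class SkillProposal:
    """A proposed skill change awaiting human review (illustrative model)."""
    description: str
    approved: bool = False

class Agent:
    def __init__(self):
        self.skills = []    # permanent, human-approved skills
        self.pending = []   # proposals awaiting review

    def propose(self, description):
        # Step 2: the agent proposes; it never self-applies.
        proposal = SkillProposal(description)
        self.pending.append(proposal)
        return proposal

    def review(self, proposal, approve):
        # Step 3: the human approves or rejects, like a PR review.
        self.pending.remove(proposal)
        if approve:
            # Step 4: approved changes become permanent skills.
            proposal.approved = True
            self.skills.append(proposal.description)

agent = Agent()
p = agent.propose("Schedule code reviews every Friday")
agent.review(p, approve=True)
```

The invariant is that nothing moves from `pending` to `skills` without an explicit `review` call from the human side.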

Where Zenii fits

Feature | Zenii | OpenClaw | NemoClaw | ZeroClaw | PicoClaw | Open Interpreter | Khoj | Gemini CLI
Category | AI backend | Chat agent | Enterprise security wrapper | Minimal daemon | Edge AI assistant | Code REPL | Document brain | Terminal AI
Language | Rust | TypeScript | TypeScript + Python | Rust | Go | Python | Python/TS | TypeScript
Binary | <20 MB (w/ GUI) | ~100 MB+ | Docker container (~500 MB+) | ~3.4 MB | <10 MB RAM | N/A (Python) | N/A (Docker) | N/A (npm)
Desktop GUI | Native (Tauri 2) | Web console | — | — | — | — | Browser | —
API Routes | 133 REST+WS | Chat endpoint | Inherits OpenClaw | Daemon endpoint | Webhook gateway | — | — | —
Plugins | Any language | JS only | Inherits OpenClaw (JS) | Rust only | Tool-based | — | — | —
Memory | FTS5 + vectors (machine-wide) | Per-agent SQLite, BM25 + vectors | Inherits OpenClaw (per-agent) | Basic | — | Workspace logs | Doc search | —
Self-Evolution | Human-approved | Autonomous | Inherits OpenClaw (sandboxed) | Agent-generated | — | — | — | —
Scheduling | Cron + one-shot | Cron | Inherits OpenClaw | Built-in | — | — | Automations | —
Offline | Ollama | Ollama | NVIDIA Nemotron primary | Ollama | DuckDuckGo | LiteLLM | Optional | No
License | MIT | Open source | Apache 2.0 | Open source | MIT | AGPL-3.0 | AGPL-3.0 | Apache 2.0

Code examples

Integrate from any language. No SDK required.

# Health check
curl http://localhost:18981/health
# → {"status": "ok"}

# Create a chat session
SESSION=$(curl -s -X POST http://localhost:18981/sessions \
  -H "Content-Type: application/json" \
  -d '{"title": "my-project"}' | jq -r '.id')

# Send a message
curl -X POST http://localhost:18981/sessions/$SESSION/messages \
  -H "Content-Type: application/json" \
  -d '{"role": "user", "content": "What tools do you have available?"}'

# Chat with the agent (non-streaming)
curl -X POST http://localhost:18981/chat \
  -H "Content-Type: application/json" \
  -d '{"session_id": "'$SESSION'", "prompt": "Search the web for Rust async patterns"}'

Frequently asked questions

General

What is Zenii, and how is it pronounced?

Zenii (pronounced "ZEN-ee-eye", /ˈzɛn.iː.aɪ/) is a portmanteau of Zen — the Japanese philosophy of calm mastery — and genii, the Latin plural of genius. It’s a local AI backend that gives every tool on your machine access to AI. Install one 20 MB binary, and your scripts have memory, your cron jobs can reason, and your Telegram bot can think. 133 API routes, a native desktop app, plugins in any language. Just Rust.

Is Zenii free?

Yes. MIT licensed. Zero subscriptions, zero accounts, zero telemetry. The infrastructure is free. Cloud AI providers (OpenAI, Anthropic, etc.) charge for API usage, but you can go 100% free with Ollama and local models.

Is Zenii production-ready?

Zenii is at version 0.1.21 with 1,500+ tests, zero clippy warnings, and 6-layer security. It’s stable for personal and development use.

Do I need to build from source?

No. Download a pre-built binary from GitHub Releases or use the install script. Building from source is for contributors.

Which platforms are supported?

Linux, macOS, Windows (x86_64), and ARM64 (including Raspberry Pi 4). Pre-built binaries are available for all platforms.

Comparisons

How does Zenii compare to ChatGPT or Claude?

Complementary. ChatGPT/Claude are conversations that vanish. Zenii is an AI that lives on your machine — it remembers what you told it last month, connects to 6 built-in providers (including those same cloud models), and exposes 133 API routes for everything else on your machine to use.

How does Zenii compare to Open Interpreter?

Open Interpreter lets AI run your code. Zenii lets your code run AI. Open Interpreter is a REPL. Zenii is a backend everything on your machine calls — with 133 API routes, persistent memory, plugins, and scheduling. MIT vs AGPL.

How does Zenii compare to Khoj?

Khoj is a brilliant AI second brain for documents. Zenii is programmable AI infrastructure — agent tools, plugin system, cron scheduling, 133 API routes. Khoj searches. Zenii acts. MIT vs AGPL.

How does Zenii compare to Gemini CLI?

Gemini CLI is a terminal tool locked to Google’s models. Zenii supports 6 built-in providers + any OpenAI-compatible endpoint, and adds persistent memory, 133 API routes, a native desktop app, plugins, and scheduling.

How does Zenii compare to OpenClaw?

OpenClaw is an incredible AI you chat with — massive community, 50+ integrations, per-agent SQLite memory. Zenii is AI infrastructure you build on: 133 API routes any script can call, machine-wide memory shared across every interface, a native desktop app in 20 MB, plugins in any language. OpenClaw handles chat; Zenii is the backend every tool on your machine calls. They can work together — an OpenClaw agent calling http://localhost:18981 gets Zenii’s cross-interface memory.

How does Zenii compare to ZeroClaw?

ZeroClaw is a 3.4 MB purist’s dream — 10ms startup, <5MB RAM, runs on a Raspberry Pi. We share the Rust DNA. Zenii makes a different trade-off: 6x larger (20 MB) but adds a native desktop GUI, machine-wide semantic memory with FTS5 + vector search, 133 API routes, plugins in any language (not just Rust), YAML/TOML workflow DAGs, and a cron scheduler. They’re complementary — you can run ZeroClaw as an agent that calls into Zenii’s memory backend.

How does Zenii compare to NemoClaw?

NemoClaw is NVIDIA’s governance layer on top of OpenClaw — policy-based execution, a privacy router, and Nemotron local models for enterprise environments. As of early 2026 it is alpha and not yet recommended for production by NVIDIA. Zenii is independent infrastructure: 133 API routes, native desktop app, machine-wide persistent memory, workflow DAGs, and plugins in any language — all in 20 MB of Rust. MIT licensed; NemoClaw is Apache 2.0 and requires OpenClaw.

How does Zenii compare to enterprise workflow platforms?

Enterprise workflow platforms are heavyweight (Docker + databases). Zenii is a single 20 MB binary for your machine. They orchestrate workflows; Zenii provides the AI reasoning those workflows can call.

Technical

Can Zenii run fully offline?

Yes. Pair it with Ollama. Zero network calls. Everything stays in local SQLite.

How do plugins work?

Any program that speaks JSON-RPC 2.0 over stdin/stdout is a valid plugin. Install from git or local paths. A Python plugin is ~15 lines.

Which AI providers are supported?

6 built-in: OpenAI, Anthropic (Claude), Google (Gemini), OpenRouter, Ollama, and Vercel AI Gateway. Add any OpenAI-compatible endpoint as a custom provider.

How is memory stored?

SQLite with dual indexing: FTS5 for full-text search and sqlite-vec for vector embeddings. Hybrid scoring combines both at recall time. Memory persists across sessions and restarts.
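The full-text half of this can be tried directly with Python's stdlib sqlite3, assuming your SQLite build ships the FTS5 extension (most official builds do). The table schema here is invented for the demo, not Zenii's actual one.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE mem USING fts5(content)")
con.execute("INSERT INTO mem VALUES ('Production DB moved to port 5434')")
con.execute("INSERT INTO mem VALUES ('Code reviews happen on Fridays')")

# MATCH runs a full-text query; bm25() provides a relevance score
# (more negative = more relevant, so ascending order puts the best first).
rows = con.execute(
    "SELECT content FROM mem WHERE mem MATCH 'port' ORDER BY bm25(mem)"
).fetchall()
print(rows)  # → [('Production DB moved to port 5434',)]
```

The vector half (sqlite-vec embeddings) would be queried separately, with the two scores blended at recall time.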

How does self-evolution work?

Zenii observes your patterns and proposes skill modifications — like a PR from your AI. You approve or reject. Approved changes become permanent. No unauthorized behavior changes.

Why Rust?

Single-binary deployment. No runtime dependencies. Memory safety without garbage collection. Async done right (tokio). The binary with a full desktop GUI is under 20 MB. Try that with Python, Node.js, or Go + Electron.

How do I connect from other languages?

HTTP. Any program that can curl http://localhost:18981 can connect — Python, Go, Bash, Node.js, Ruby, or anything else. 133 REST + WebSocket routes, JSON in, JSON out.

Do the desktop app, CLI, TUI, and daemon share state?

Yes. All interfaces share the same memory, same tools, same AI providers, same everything. There’s one brain behind localhost:18981 — it doesn’t matter which door you walk through.

How does Zenii fit alongside my existing AI tools?

It’s the shared brain behind all of them. Your scripts, bots, cron jobs, and desktop app all connect to the same intelligence. Instead of 5 disconnected AI tools, you have one backend that remembers everything across all of them.

You're early.

Zenii is in active development toward v1.0. Star the repo to follow the journey — early supporters get credited, and your feedback directly shapes the roadmap.

Download Zenii 0.1.21

Choose the right installer for your platform. Your OS has been auto-detected.

Full GUI application with native window