oAI brings OpenAI, Anthropic, OpenRouter, Google Gemini, and local Ollama models together in one beautifully native macOS app — with smart memory, file access, and Git sync built in.
Switch between OpenAI, Anthropic, OpenRouter (300+ models), Google Gemini, and local Ollama — all with a single keyboard shortcut. No context lost, no restart needed.
Reduce token costs by 50–80% with intelligent context selection, message starring, and progressive summarisation of long conversations.
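The core idea behind this kind of context selection can be sketched in a few lines. This is an illustrative sketch only, not oAI's actual implementation: `build_context`, the `starred` flag, and `keep_recent` are assumed names, and the summary is a stand-in for what would really be an LLM-generated digest.

```python
# Hypothetical sketch of token-saving context selection: keep starred and
# recent messages verbatim, collapse everything older into one summary stub.
def build_context(messages, keep_recent=4):
    """messages: list of dicts with 'text' and 'starred' keys."""
    recent = messages[-keep_recent:]
    older = messages[:-keep_recent]
    kept = [m for m in older if m["starred"]]
    dropped = [m for m in older if not m["starred"]]
    context = []
    if dropped:
        # In a real app this would be an LLM-generated summary; here we
        # just record how many messages were compressed away.
        context.append({"text": f"[summary of {len(dropped)} earlier messages]",
                        "starred": False})
    return context + kept + recent
```

Only the summary stub and the starred/recent messages are sent to the model, which is where the token savings come from.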
Find any conversation by meaning, not just keywords. AI embeddings let you search across your entire history with natural language.
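Search-by-meaning boils down to comparing embedding vectors. A toy sketch of the idea, under the assumption that each conversation has already been embedded by a provider API (the vectors below are dummies, and `search`/`cosine` are illustrative names):

```python
import math

# Toy embedding search: rank stored conversation vectors by cosine
# similarity to the query vector and return the best match.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def search(query_vec, index):
    """index: list of (conversation_id, embedding) pairs."""
    return max(index, key=lambda item: cosine(query_vec, item[1]))[0]
```

Because similarity is computed in vector space, a query like "that chat about rate limits" can match a conversation that never used those exact words.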
Grant the AI read (and optionally write) access to specific folders. Ask questions about your own code, notes, and documents directly.
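Folder-scoped access is, at heart, an allowlist check on resolved paths. A minimal sketch of that rule — the granted folder and the `is_allowed` helper are assumptions for illustration, not oAI's API:

```python
import pathlib

# Sketch of folder scoping: a path is readable only if it resolves to a
# location inside one of the user-granted roots (hypothetical grant below).
GRANTED = [pathlib.Path("/Users/me/notes").resolve()]

def is_allowed(path):
    p = pathlib.Path(path).resolve()  # resolving defeats ../ escapes
    return any(p == root or root in p.parents for root in GRANTED)
```

Resolving before comparing is the important part: it stops `../`-style paths from escaping the granted folder.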
Let the AI run shell commands via /bin/zsh. Opt-in with an approval gate — review every command before it runs, or grant session-wide trust.
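The approval-gate pattern described above can be sketched as follows. This is a hypothetical illustration, not the app's code: the function name, the prompt wording, and the session-trust flag are all assumptions; only the `/bin/zsh -c` invocation comes from the feature description.

```python
import subprocess

# Hypothetical approval gate: every command is shown to the user before it
# runs; answering "always" grants trust for the rest of the session.
session_trusted = False

def run_with_approval(command, ask=input):
    global session_trusted
    if not session_trusted:
        answer = ask(f"Run `{command}`? [y/N/always] ").strip().lower()
        if answer == "always":
            session_trusted = True
        elif answer != "y":
            return None  # rejected: nothing is executed
    result = subprocess.run(["/bin/zsh", "-c", command],
                            capture_output=True, text=True)
    return result.stdout
```

The key property is that rejection happens before `subprocess.run` is ever reached, so an unapproved command has no side effects at all.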
One-click backup of all settings to iCloud Drive. Restore on any Mac in seconds. API keys are excluded for security and must be re-entered after restoring.
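The "API keys are excluded" rule is simple to express: strip secret fields before anything leaves the machine. A sketch under assumed setting names (the key names and `backup_settings` are illustrative, not oAI's schema):

```python
import json

# Sketch of the backup rule: everything except secrets goes into the
# archive; secrets must be re-entered after a restore.
SECRET_KEYS = {"openai_api_key", "anthropic_api_key"}  # illustrative names

def backup_settings(settings):
    safe = {k: v for k, v in settings.items() if k not in SECRET_KEYS}
    return json.dumps(safe, indent=2)
```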
Conversations export as Markdown and push to your Git repo automatically — free backup and cross-device sync with zero infrastructure required.
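The export-and-sync flow amounts to writing Markdown files into a repo and running plain `git` commands. A minimal sketch, assuming a local clone with a configured remote (file layout, function names, and commit message are illustrative):

```python
import datetime
import pathlib
import subprocess

# Sketch: write each conversation as a Markdown file, then stage, commit,
# and push. No special infrastructure — just an ordinary Git repo.
def export_conversation(title, messages, repo_dir):
    repo = pathlib.Path(repo_dir)
    lines = [f"# {title}", ""]
    for role, text in messages:
        lines += [f"**{role}:** {text}", ""]
    out = repo / f"{title}.md"
    out.write_text("\n".join(lines))
    return out

def sync(repo_dir):
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    subprocess.run(["git", "-C", repo_dir, "add", "-A"], check=True)
    subprocess.run(["git", "-C", repo_dir, "commit", "-m", f"chat sync {stamp}"],
                   check=True)
    subprocess.run(["git", "-C", repo_dir, "push"], check=True)
```

Because the export format is plain Markdown, the synced history stays readable on any device — even without the app installed.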
Enable web search via DuckDuckGo or Google so the AI can pull real-time information alongside its training knowledge. Works with all providers.
Connect to your self-hosted document archive. Search, read, and upload documents from the chat — your entire paper trail, AI-accessible.
Monitor your inbox via IMAP and let the AI automatically respond to emails carrying a specific subject tag. Fully configurable, with rate limiting and AES-256-encrypted credentials.
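The filtering logic behind auto-reply can be sketched independently of the IMAP plumbing. Everything here is an assumption for illustration — the tag string, the rate-limit interval, and the function name are not oAI's configuration values:

```python
import time

# Hypothetical auto-reply filter: only messages whose subject carries the
# configured tag are answered, and replies are rate-limited.
TAG = "[AI]"          # assumed subject tag
MIN_INTERVAL = 60.0   # assumed minimum seconds between replies
_last_reply = 0.0

def should_reply(subject, now=None):
    global _last_reply
    now = time.monotonic() if now is None else now
    if TAG not in subject:
        return False
    if now - _last_reply < MIN_INTERVAL:
        return False
    _last_reply = now
    return True
```

The rate limit is the safety valve: even if a tagged thread turns into a mail loop, replies can never exceed one per interval.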
Type / to summon autocomplete for every feature. No hunting through menus — just type and go.
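Slash-command autocomplete is plain prefix matching over a known command list. A tiny sketch with made-up command names (the real command set is the app's, not this list):

```python
# Illustrative command registry; the actual commands are defined by the app.
COMMANDS = ["/search", "/summarize", "/export", "/star"]

def complete(prefix):
    """Return all commands matching the typed prefix, in registry order."""
    return [c for c in COMMANDS if c.startswith(prefix)]
```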
Read the full documentation to set up your first API key and start chatting in under a minute.