How can we help?

Browse the FAQ below or send us a message, and we'll get back to you as soon as we can.

Send a message

Fill in the form and we'll get back to you as soon as possible.

Clicking Send will open your mail app with everything pre-filled.


Common Questions

Quick answers to frequent issues

How do I add my API key?

Press ⌘, to open Settings, go to the Providers tab, and paste your key. Keys are stored securely in the macOS Keychain. You need at least one key from OpenAI, Anthropic, OpenRouter, or Google to get started.

Which models and providers are supported?

oAI supports 300+ models via OpenRouter, all Claude models via Anthropic, all GPT models via OpenAI, Gemini via Google, and any locally running model via Ollama (no API key is required for local models).

How can I reduce token usage and cost?

Smart Context intelligently selects which messages to include in each request instead of sending your entire history. It can reduce token usage (and therefore cost) by 50–80% while keeping the AI well-informed. Enable it in Settings → Advanced → Smart Context Selection.
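
To put that in purely illustrative numbers: if a conversation's full history is around 20,000 tokens, Smart Context might include only 4,000–10,000 of them in each request, and the per-request cost shrinks by the same proportion.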

How do I let the AI access my local files?

Use MCP (Model Context Protocol). Type /mcp on to enable it, then /mcp add ~/YourFolder to grant access to a specific folder. The AI can then read files and answer questions about your code and documents. Write access is disabled by default; enable it with /mcp write on.
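
For example, to give the AI read access to a hypothetical ~/Documents/Notes folder (substitute your own path) and then also allow it to edit files there, you would type these commands into the chat, one at a time:

    /mcp on
    /mcp add ~/Documents/Notes
    /mcp write on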

Can I run models locally?

Yes. Install Ollama and pull any local model (e.g. ollama pull llama3.2), and oAI will detect it automatically. No API key or internet connection is required for local models.
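
Assuming Ollama is already installed and running, pulling a model and confirming it is available takes two Terminal commands (llama3.2 is just the example model from above; any model in Ollama's library works the same way):

    ollama pull llama3.2
    ollama list

Once the model shows up in the list, it should appear in oAI without any further setup.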

How do I back up or sync my conversations?

oAI exports conversations as Markdown files and pushes them to a Git repository you configure. Enable it in Settings → Git Sync. It works with GitHub, GitLab, Gitea, Bitbucket, or any self-hosted Git host, and your credentials are stored encrypted in the Keychain.
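
To confirm a sync went through, you can clone the configured repository somewhere else and look at the exported Markdown. The repository URL below is only a placeholder; use whatever repository you pointed Git Sync at:

    git clone git@github.com:yourname/oai-conversations.git
    ls oai-conversations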

What are the system requirements?

oAI requires macOS 14 Sonoma or later. It is built with SwiftUI and takes advantage of modern macOS APIs for the best native experience.
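
If you're not sure which version of macOS you're on, open About This Mac, or run this in Terminal to print just the version number (for example, 14.5):

    sw_vers -productVersion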