Chat with Apple Intelligence, Ollama, OpenAI, Anthropic, Google Gemini, and any OpenAI-compatible endpoint side by side in one native macOS window. Compare responses across models, fork conversations, and keep full control over every parameter sent to the API.
Multi-Provider
Six providers, one app. Run local models with Ollama and Apple Intelligence, connect to OpenAI, Anthropic, and Google Gemini via API, or add any OpenAI-compatible endpoint such as LM Studio, LocalAI, or Open WebUI. Switch between them freely.
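What makes "any OpenAI-compatible endpoint" possible is that these servers all expose the same `/v1/chat/completions` route, so only the base URL changes per provider. A minimal sketch (in Python for illustration; the app itself is native macOS, and the helper name and example port are assumptions, not part of the app):

```python
import json

# Hypothetical helper: build the request for any OpenAI-compatible server.
# Only the base URL differs between Ollama, LM Studio, LocalAI, etc.
def build_chat_request(base_url, model, prompt):
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload)

# Ollama's default local port is 11434; check your own server's address.
url, body = build_chat_request("http://localhost:11434", "llama3.2", "Hello")
print(url)  # http://localhost:11434/v1/chat/completions
```

Actually sending the request requires a running server; the sketch only shows why swapping providers is a one-line configuration change.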
Compare Models
Send the same prompt to 2–4 models simultaneously and see responses side by side. Works across providers, streams in real time, and shows token and speed metrics for each response.
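The comparison pattern above can be sketched as a parallel fan-out: the same prompt goes to every selected model at once, and each response carries its own metrics. A minimal sketch (Python for illustration; `ask_model` is a stub standing in for a real streaming provider call, and the model names are placeholders):

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Stub provider call; a real implementation would stream tokens
# from each provider's API instead of echoing the prompt.
def ask_model(model, prompt):
    start = time.perf_counter()
    reply = f"[{model}] echo: {prompt}"          # placeholder response
    elapsed = time.perf_counter() - start
    tokens = len(reply.split())                  # crude token count
    return {"model": model, "reply": reply, "tokens": tokens,
            "tok_per_s": tokens / max(elapsed, 1e-9)}

def compare(models, prompt):
    # Dispatch the same prompt to every model in parallel;
    # pool.map preserves the input order for side-by-side display.
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        return list(pool.map(lambda m: ask_model(m, prompt), models))

results = compare(["gpt-4o", "claude-sonnet", "gemini-pro"], "Explain actors")
```

Running the calls concurrently means the slowest model, not the sum of all models, bounds the wait.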
Fork Conversations
Branch any conversation at any message. Explore different directions without losing context. Forks nest visually in the sidebar with full ancestry tracking.
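Forking boils down to a tree of conversations where each branch keeps a pointer to its parent and a copy of the shared message prefix. A minimal sketch of that data structure (Python for illustration; this is an assumed shape, not the app's actual SwiftData model):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Conversation:
    title: str
    parent: Optional["Conversation"] = None
    fork_index: Optional[int] = None          # message index where the branch starts
    messages: list = field(default_factory=list)

    def fork(self, at):
        # A fork copies the shared prefix up to and including message `at`,
        # so both branches keep full context without sharing mutable state.
        return Conversation(self.title + " (fork)", parent=self, fork_index=at,
                            messages=self.messages[:at + 1])

    def ancestry(self):
        # Walk parent pointers to the root; this is the "full ancestry
        # tracking" the sidebar can render as nested forks.
        node, chain = self, []
        while node.parent is not None:
            chain.append(node.parent)
            node = node.parent
        return chain
```

Copying the prefix rather than referencing it means editing one branch can never corrupt its siblings.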
Full Transparency
See exactly what's being sent to the API. System prompts, temperature, max tokens, top_p — every parameter is visible, adjustable, and copyable. No hidden magic.
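For context, this is roughly the shape of a chat-completions request body those parameters live in (parameter names follow the OpenAI chat API; the values shown here are placeholders for illustration, not the app's defaults):

```python
import json

# Illustrative request body; every field is visible and adjustable.
request = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this file."},
    ],
    "temperature": 0.7,    # sampling randomness
    "max_tokens": 1024,    # response length cap
    "top_p": 1.0,          # nucleus-sampling cutoff
}

# Serializing the dict shows the exact bytes that go over the wire.
wire = json.dumps(request, indent=2)
print(wire)
```

Being able to copy this payload verbatim is what "no hidden magic" means in practice: the request you inspect is the request that is sent.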
Prompt Library
Built-in templates for coding, writing, research, and more. Create your own custom presets with named profiles that persist per conversation.
Themes
10 color themes including seasonal variants, each with light and dark mode support. The entire UI adapts to your chosen palette.
Private by Default
All conversations are stored locally on your Mac using SwiftData. API keys are secured in your macOS Keychain. Local models with Ollama and Apple Intelligence work completely offline — no data leaves your machine.