ChatHarbor

One native app for all your AI.

Download for macOS or build from source on GitHub
Apple Intelligence
Ollama
OpenAI
Anthropic
Google Gemini
Custom Endpoints

Chat with Apple Intelligence, Ollama, OpenAI, Anthropic, Google Gemini, and any OpenAI-compatible endpoint side by side in one native macOS window. Compare responses across models, fork conversations, and keep full control over every parameter sent to the API.

⚖️

Multi-Provider

Six providers, one app. Run local models with Ollama and Apple Intelligence, connect to OpenAI, Anthropic, and Google Gemini via API, or add any OpenAI-compatible endpoint such as LM Studio, LocalAI, or Open WebUI. Switch between them freely.
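The reason "any OpenAI-compatible endpoint" works is that these servers all speak the same chat-completions wire format, so switching providers is mostly a matter of changing the base URL. A minimal sketch of that idea, in Python for illustration only (the endpoint URLs are assumptions based on each tool's commonly documented defaults, not ChatHarbor internals):

```python
import json

# Hypothetical endpoint table; adjust URLs to match your own setup.
ENDPOINTS = {
    "openai":    "https://api.openai.com/v1/chat/completions",
    "lm_studio": "http://localhost:1234/v1/chat/completions",
    "ollama":    "http://localhost:11434/v1/chat/completions",
}

def build_request(provider: str, model: str, prompt: str) -> tuple[str, str]:
    """Return (url, json_body) for an OpenAI-style chat completion call.

    The body is identical across providers; only the URL changes.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return ENDPOINTS[provider], body

url, body = build_request("lm_studio", "llama-3.1-8b", "Hello")
```

Because the request shape never changes, a client written against one provider can target any of the others by swapping the URL and model name.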

📊

Compare Models

Send the same prompt to 2–4 models simultaneously and see responses side by side. Cross-provider, real-time streaming, with token and speed metrics.

🔌

Fork Conversations

Branch any conversation at any message. Explore different directions without losing context. Forks nest visually in the sidebar with full ancestry tracking.

👁

Full Transparency

See exactly what's being sent to the API. System prompts, temperature, max tokens, top_p — every parameter is visible, adjustable, and copyable. No hidden magic.
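To make the parameters above concrete, here is a sketch of the kind of OpenAI-style payload such a request contains. Parameter names follow the OpenAI Chat Completions API; the model name and values are illustrative defaults, not ChatHarbor's:

```python
import json

# Illustrative chat-completions payload; every field here is the sort of
# parameter a transparent client would surface for inspection and editing.
payload = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize this file."},
    ],
    "temperature": 0.7,   # sampling randomness: lower is more deterministic
    "max_tokens": 1024,   # hard cap on the length of the reply
    "top_p": 0.9,         # nucleus-sampling probability cutoff
}

print(json.dumps(payload, indent=2))
```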

📚

Prompt Library

Built-in templates for coding, writing, research, and more. Create your own custom presets with named profiles that persist per conversation.

🎨

Themes

10 color themes including seasonal variants, each with light and dark mode support. The entire UI adapts to your chosen palette.

🧠

Multi-Model Brainstorming

Run structured brainstorm sessions with multiple AI models playing different roles. Six Hats, SCAMPER, Starbursting, and more. Each model brings a distinct perspective, with automatic phase progression from ideation through synthesis.

New in v2

Ships & Harbor

Build custom AI workspaces called Ships. Each Ship has its own model, personality, knowledge base, and conversation history. Load URLs, documents, and reference text as Cargo. Ships appear in the sidebar and work everywhere — chat, brainstorm, compare. Like custom GPTs, but for every provider.

New in v2.2

🔒

Private by Default

All conversations are stored locally on your Mac using SwiftData. API keys are secured in your macOS Keychain. Local models with Ollama and Apple Intelligence work completely offline — no data leaves your machine.