High-performance Go proxy that exposes Anthropic, Gemini, and OpenAI-compatible APIs. By default it forwards requests to GitHub Copilot's backend (api.githubcopilot.com), and it can also route selected models to configured providers such as Azure OpenAI behind the same proxy endpoint. This lets you use tools that speak the Anthropic, Gemini, or OpenAI protocol with your GitHub Copilot subscription, while still exposing additional provider-backed models when configured.
- Anthropic Messages API
- Gemini Generate Content and Count Tokens APIs
- OpenAI Chat Completions API
- OpenAI Responses API, including Codex websocket bridging
- Multi-provider model routing, including Azure OpenAI deployments behind the same /v1 endpoint
- Proxy-owned Codex compatibility endpoints for compaction and memory summarization
- Streaming, tool use, parallel tool calls, compressed request bodies, and OAuth token caching
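As a quick sanity check of the Anthropic-compatible surface, you can hit the proxy with a raw request. This is a sketch, not taken from the project docs: it assumes the proxy is listening on localhost:1337 (the port used in the Docker example) and that it accepts the standard Anthropic Messages request shape at /v1/messages with a placeholder API key.

```shell
# Sketch: exercising the Anthropic Messages surface directly with curl.
# Assumptions: vekil listens on localhost:1337, serves the standard
# /v1/messages path, and ignores the x-api-key value.
curl -s http://localhost:1337/v1/messages \
  -H "content-type: application/json" \
  -H "x-api-key: dummy" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Reply with exactly PROXY_OK"}]
  }'
```

A successful response comes back in the Anthropic Messages format, so any Anthropic SDK pointed at this base URL should work unchanged.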
Download the latest binary for your platform from GitHub Releases, then run it locally.
Or with Docker from GHCR:
```shell
docker run -p 1337:1337 \
  -v ~/.config/vekil:/home/nonroot/.config/vekil \
  ghcr.io/sozercan/vekil:latest
```

On Apple Silicon Macs, you can also use the native menubar app:
```shell
brew install --cask sozercan/repo/vekil
```

Note: The app is not signed. Clear extended attributes, including quarantine, with:

```shell
xattr -cr /Applications/Vekil.app
```
Manual downloads still work through the vekil-macos-arm64.zip asset on GitHub Releases. See macOS Menubar App.
If your setup includes a Copilot provider, the first run starts GitHub's device code flow. Tokens are cached in ~/.config/vekil/.
For multi-provider setup and non-Copilot-only deployments, see Configuration.
The full documentation now lives under docs/ in smaller, topic-focused files:
- Docs Index
- Getting Started
- Configuration
- Client Usage Examples
- API Reference
- Architecture
- macOS Menubar App
- Development
```shell
env ANTHROPIC_BASE_URL=http://localhost:1337 \
  ANTHROPIC_API_KEY=dummy \
  claude --model claude-sonnet-4 --print --output-format text "Reply with exactly PROXY_OK"
```

```shell
env OPENAI_API_KEY=dummy \
  OPENAI_BASE_URL=http://localhost:1337/v1 \
  codex exec --skip-git-repo-check -m gpt-5.4 "Reply with exactly PROXY_OK"
```

```shell
env GEMINI_API_KEY=dummy \
  GOOGLE_GEMINI_BASE_URL=http://localhost:1337 \
  GOOGLE_GENAI_API_VERSION=v1beta \
  GEMINI_CLI_NO_RELAUNCH=true \
  gemini -m gemini-2.5-pro -p "Reply with exactly PROXY_OK" -o json
```

MIT