sozercan/vekil


vekil

High-performance Go proxy that exposes Anthropic, Gemini, and OpenAI-compatible APIs. By default it forwards requests to GitHub Copilot's backend (api.githubcopilot.com), and it can also route selected models to configured providers such as Azure OpenAI behind the same proxy endpoint. This lets you use tools that speak the Anthropic, Gemini, or OpenAI protocol with your GitHub Copilot subscription, while still exposing additional provider-backed models when configured.

What It Supports

  • Anthropic Messages API
  • Gemini Generate Content and Count Tokens APIs
  • OpenAI Chat Completions API
  • OpenAI Responses API, including Codex websocket bridging
  • Multi-provider model routing, including Azure OpenAI deployments behind the same /v1 endpoint
  • Proxy-owned Codex compatibility endpoints for compaction and memory summarization
  • Streaming, tool use, parallel tool calls, compressed request bodies, and OAuth token caching

Quick Start

Download the latest binary for your platform from GitHub Releases, then run it locally.

Or with Docker from GHCR:

```shell
docker run -p 1337:1337 \
  -v ~/.config/vekil:/home/nonroot/.config/vekil \
  ghcr.io/sozercan/vekil:latest
```

On Apple Silicon Macs, you can also use the native menubar app.

```shell
brew install --cask sozercan/repo/vekil
```

Note: The app is not signed. Clear extended attributes, including quarantine, with:

```shell
xattr -cr /Applications/Vekil.app
```

Manual downloads still work through the vekil-macos-arm64.zip asset on GitHub Releases. See macOS Menubar App.

If your setup includes a Copilot provider, the first run starts GitHub's device code flow. Tokens are cached in ~/.config/vekil/.
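Once the proxy is running, a quick way to confirm it is reachable is to list models over the OpenAI-compatible surface. This is a sketch: the `/v1/models` route and the placeholder key follow the standard OpenAI protocol the proxy exposes, and are assumptions rather than documented vekil behavior.

```shell
# List models through the OpenAI-compatible API on the default port.
# The key value is a placeholder; the proxy handles upstream auth itself.
curl -s http://localhost:1337/v1/models \
  -H "Authorization: Bearer dummy"
```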

For multi-provider setup and non-Copilot-only deployments, see Configuration.

Docs

The full documentation lives under docs/ in smaller, topic-focused files.

Most Common Client Setup

Claude Code

```shell
env ANTHROPIC_BASE_URL=http://localhost:1337 \
  ANTHROPIC_API_KEY=dummy \
  claude --model claude-sonnet-4 --print --output-format text "Reply with exactly PROXY_OK"
```
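The same smoke test can be issued without the CLI by speaking the Anthropic Messages protocol directly. A minimal sketch, assuming the proxy accepts standard Messages payloads on `/v1/messages` and ignores the placeholder key:

```shell
# Minimal Anthropic Messages request against the local proxy.
curl -s http://localhost:1337/v1/messages \
  -H "content-type: application/json" \
  -H "x-api-key: dummy" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Reply with exactly PROXY_OK"}]
  }'
```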

OpenAI Codex CLI

```shell
env OPENAI_API_KEY=dummy \
  OPENAI_BASE_URL=http://localhost:1337/v1 \
  codex exec --skip-git-repo-check -m gpt-5.4 "Reply with exactly PROXY_OK"
```

Gemini CLI

```shell
env GEMINI_API_KEY=dummy \
  GOOGLE_GEMINI_BASE_URL=http://localhost:1337 \
  GOOGLE_GENAI_API_VERSION=v1beta \
  GEMINI_CLI_NO_RELAUNCH=true \
  gemini -m gemini-2.5-pro -p "Reply with exactly PROXY_OK" -o json
```
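The Gemini surface can also be exercised with plain curl. A sketch: the `/v1beta` path mirrors the GOOGLE_GENAI_API_VERSION setting above and follows the standard Gemini Generate Content REST shape; the `key` query parameter is a placeholder.

```shell
# Minimal Gemini generateContent request against the local proxy.
curl -s "http://localhost:1337/v1beta/models/gemini-2.5-pro:generateContent?key=dummy" \
  -H "content-type: application/json" \
  -d '{"contents": [{"parts": [{"text": "Reply with exactly PROXY_OK"}]}]}'
```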

License

MIT
