🤖 ADKBot is a powerful, multi-model AI assistant framework built on Google's Agent Development Kit (ADK) with LiteLLM for universal model support. ADKBot is an ADK-native project, built from the ground up to leverage ADK's agent architecture while preserving and extending a rich tooling and channel ecosystem.
⚡ Use any LLM provider (NVIDIA NIM, Gemini, Groq, OpenRouter, Anthropic, OpenAI, xAI, Ollama, and 50+ more) through a single unified interface.
🔌 Connect to 12+ chat platforms (Telegram, Discord, WhatsApp, Slack, WeChat, and more).
🛠️ Equipped with 10+ built-in tools (web search, file operations, shell commands, scheduled tasks, MCP support, and sub-agent spawning).
🧠 ADK-Powered: Built on Google's Agent Development Kit for robust agent lifecycle management, native callbacks, and session handling.
🌐 Multi-Model: LiteLLM integration means you can use Claude, GPT, Gemini, DeepSeek, Llama, and 50+ other models without changing code.
🔧 Rich Tooling: Web search (5 providers), file operations, shell execution, cron scheduling, MCP protocol support, and sub-agent spawning.
📱 12+ Chat Channels: Telegram, Discord, WhatsApp, WeChat, Feishu, DingTalk, Slack, Matrix, Email, QQ, WeCom, and Mochat.
⏰ Scheduled Tasks: Cron expressions, interval timers, and one-time scheduling with timezone support.
🔒 Security: Workspace sandboxing, command safety guards, SSRF protection, and per-channel access control.
💎 Easy to Use: One command to set up, one command to chat.
- Key Features
- Install
- Quick Start
- Chat Apps
- Configuration
- Multiple Instances
- CLI Reference
- Python SDK
- OpenAI-Compatible API
- Docker
- Linux Service
- Project Structure
- Contributing
With uv (recommended, fast):
uv tool install adkbot

With pip:

pip install adkbot

Install with additional channel extras
The core install includes Telegram and WhatsApp out of the box. For other channels, install the extras you need:
# Install with Discord support
pip install "adkbot[discord]"
# Install with multiple channels
pip install "adkbot[discord,slack,feishu]"
# Install everything (all channels + tools)
pip install "adkbot[all]"

Available extras: discord, slack, feishu, dingtalk, qq, mochat, matrix, weixin, wecom, socks, api.
Note: When configuring a channel during adkbot onboard, the wizard automatically detects whether its SDK is missing and offers to install it for you.
Install from source (for development)
git clone https://github.com/nwokike/ADKbot.git
cd ADKbot
uv venv
# Windows: .venv\Scripts\activate
# macOS/Linux: source .venv/bin/activate
uv sync --all-extras

Install on Termux (Android)
Python packages with native dependencies can cause build issues inside raw Termux. Use proot-distro to run a proper Linux distribution inside Termux instead.
# Install proot-distro
pkg update && pkg upgrade
pkg install proot-distro
# Install and log into Ubuntu
proot-distro install ubuntu
proot-distro login ubuntu
# Inside Ubuntu: install uv and configure it for Android's filesystem
apt update && apt upgrade -y
apt install curl -y
curl -LsSf https://astral.sh/uv/install.sh | sh
echo 'export UV_LINK_MODE=copy' >> ~/.bashrc
source ~/.bashrc
# Install ADKBot
uv tool install adkbot

Upgrade with uv:

uv tool upgrade adkbot
adkbot --version

Upgrade with pip:

pip install -U adkbot
adkbot --version

Requirements:

- Python >= 3.11
- An API key from any supported LLM provider
Tip
Get API keys:
- NVIDIA NIM (recommended, completely free, massive open-weight model catalog)
- Google Gemini (free tier available, best ADK integration; try gemini/gemma-4-31b-it, where it's practically impossible to hit the free limits)
- Groq (fastest inference, free tier)
- OpenRouter (access to many models via one key)
- Anthropic (Claude Opus 4.6)
- OpenAI (GPT 5.4)
- xAI (Grok 4.20)
API keys can be set as environment variables (e.g., NVIDIA_NIM_API_KEY=nvapi-xxx) or entered during the wizard.
For web search setup, see Web Search.
1. Initialize
adkbot onboard

This starts the interactive wizard by default. Use adkbot onboard --skip-wizard to create a basic config without the wizard.
2. Configure (~/.adkbot/config.json)
Configure your model using a LiteLLM model string:
{
"agents": {
"defaults": {
"model": "nvidia_nim/nvidia/nemotron-3-super-120b-a12b"
}
}
}

LiteLLM model strings work with 100+ providers. Examples:

- "nvidia_nim/nvidia/nemotron-3-super-120b-a12b" - NVIDIA NIM (uses NVIDIA_NIM_API_KEY, free)
- "nvidia_nim/moonshotai/kimi-k2-instruct-0905" - Kimi K2 via NVIDIA NIM (free)
- "gemini/gemma-4-31b-it" - Google Gemma 4 (uses GEMINI_API_KEY, generous free tier)
- "gemini/gemini-3.1-pro-preview" - Google Gemini (uses GEMINI_API_KEY)
- "groq/llama-3.3-70b-versatile" - Groq (uses GROQ_API_KEY)
- "anthropic/claude-opus-4-6" - Anthropic Claude (uses ANTHROPIC_API_KEY)
- "openai/gpt-5.4" - OpenAI (uses OPENAI_API_KEY)
- "openrouter/anthropic/claude-opus-4-6" - OpenRouter gateway (uses OPENROUTER_API_KEY)
- "xai/grok-4.20-beta-0309-reasoning" - xAI Grok (uses GROK_API_KEY)
- "deepseek/deepseek-chat" - DeepSeek (uses DEEPSEEK_API_KEY)
- "ollama/llama3.2" - Local Ollama (no API key needed)
Set your API key as an environment variable (e.g., NVIDIA_NIM_API_KEY=nvapi-xxx) or enter it during the wizard.
Why NVIDIA NIM?
NVIDIA NIM is our recommended default for new users because:
- Completely free with no credit card required
- Hosts hundreds of top open-weight models (Nemotron, Llama 4, Kimi K2, Mistral, Gemma, and more)
- Runs on NVIDIA's own Hopper GPU infrastructure so inference is fast
- Works with LiteLLM out of the box using the nvidia_nim/ prefix
Popular NVIDIA NIM models:
| Model | String | Best for |
|---|---|---|
| Nemotron 3 Super 120B | nvidia_nim/nvidia/nemotron-3-super-120b-a12b | General reasoning, coding |
| Kimi K2 Instruct | nvidia_nim/moonshotai/kimi-k2-instruct-0905 | Long context, complex tasks |
| Llama 4 Scout 17B | nvidia_nim/meta/llama-4-scout-17b-16e-instruct | Fast text generation |
| Gemma 4 27B | nvidia_nim/google/gemma-4-27b-it | Lightweight general tasks |
Sign up at build.nvidia.com and grab your free API key.
Provider comparison at a glance
| Provider | Free tier | Speed | Model variety | Best for |
|---|---|---|---|---|
| NVIDIA NIM | Yes (completely free) | Fast | Hundreds of open-weight models | Default choice, coding, reasoning |
| Google Gemini | Yes (generous limits) | Fast | Gemini family only | Native ADK integration, huge context |
| Groq | Yes (rate limited) | Fastest | Llama, Mixtral | Low-latency chat |
| OpenRouter | No (pay per token) | Varies | 200+ models from all providers | Access to everything via one key |
| Anthropic | No | Medium | Claude family only | Complex writing, analysis |
| OpenAI | No | Medium | GPT family only | Broad compatibility |
| xAI | Limited | Fast | Grok family only | Reasoning, code |
| Ollama | Yes (local) | Hardware dependent | Any GGUF model | Privacy, offline use |
3. Chat
adkbot agent

That's it! You have a working AI assistant in 2 minutes.
Connect ADKBot to your favorite chat platform. Want to build your own? See the Channel Plugin Guide.
| Channel | What you need |
|---|---|
| Telegram | Bot token from @BotFather |
| Discord | Bot token + Message Content intent |
| WhatsApp | QR code scan (adkbot channels login whatsapp) |
| WeChat (Weixin) | QR code scan (adkbot channels login weixin) |
| Feishu | App ID + App Secret |
| DingTalk | App Key + App Secret |
| Slack | Bot token + App-Level token |
| Matrix | Homeserver URL + Access token |
| Email | IMAP/SMTP credentials |
| QQ | App ID + App Secret |
| WeCom | Bot ID + Bot Secret |
| Mochat | Claw token (auto-setup available) |
Telegram (Recommended)
1. Create a bot
- Open Telegram, search @BotFather
- Send /newbot, follow prompts
- Copy the token
2. Configure
{
"channels": {
"telegram": {
"enabled": true,
"token": "YOUR_BOT_TOKEN",
"allowFrom": ["YOUR_USER_ID"]
}
}
}

You can find your User ID in Telegram settings. Copy it without the @ symbol.
3. Run
adkbot gateway

Discord
1. Create a bot
- Go to https://discord.com/developers/applications
- Create an application → Bot → Add Bot
- Copy the bot token
2. Enable intents
- In Bot settings, enable MESSAGE CONTENT INTENT
3. Get your User ID
- Discord Settings → Advanced → enable Developer Mode
- Right-click your avatar → Copy User ID
4. Configure
{
"channels": {
"discord": {
"enabled": true,
"token": "YOUR_BOT_TOKEN",
"allowFrom": ["YOUR_USER_ID"],
"groupPolicy": "mention"
}
}
}
groupPolicy: "mention" (default, respond when @mentioned) or "open" (respond to all messages).
5. Invite the bot
- OAuth2 → URL Generator
- Scopes: bot
- Bot Permissions: Send Messages, Read Message History
- Open the generated invite URL and add the bot to your server
6. Run
adkbot gateway

WhatsApp

Requires Node.js ≥ 18.
1. Link device
adkbot channels login whatsapp
# Scan QR with WhatsApp → Settings → Linked Devices

2. Configure
{
"channels": {
"whatsapp": {
"enabled": true,
"allowFrom": ["+1234567890"]
}
}
}

3. Run

adkbot gateway

Matrix (Element)
Install Matrix dependencies first:
pip install "adkbot[matrix]"

1. Create/choose a Matrix account
Create or reuse a Matrix account on your homeserver (for example matrix.org).
2. Get credentials
You need:
- userId (example: @adkbot:matrix.org)
- accessToken
- deviceId (recommended so sync tokens can be restored across restarts)
3. Configure
{
"channels": {
"matrix": {
"enabled": true,
"homeserver": "https://matrix.org",
"userId": "@adkbot:matrix.org",
"accessToken": "syt_xxx",
"deviceId": "ADKBOT01",
"e2eeEnabled": true,
"allowFrom": ["@your_user:matrix.org"],
"groupPolicy": "open"
}
}
}

| Option | Description |
|---|---|
| allowFrom | User IDs allowed to interact. Empty denies all; use ["*"] to allow everyone. |
| groupPolicy | open (default), mention, or allowlist. |
| e2eeEnabled | E2EE support (default true). Set false for plaintext-only. |
4. Run
adkbot gateway

Mochat (Claw IM)
Uses Socket.IO WebSocket by default, with HTTP polling fallback.
1. Ask ADKBot to set up Mochat for you
Simply send this message to ADKBot:
Read https://raw.githubusercontent.com/nwokike/MoChat/refs/heads/main/skills/adkbot/skill.md and register on MoChat. My Email account is [email protected] Bind me as your owner and DM me on MoChat.
2. Restart gateway
adkbot gateway

Manual configuration (advanced)
{
"channels": {
"mochat": {
"enabled": true,
"base_url": "https://mochat.io",
"socket_url": "https://mochat.io",
"socket_path": "/socket.io",
"claw_token": "claw_xxx",
"agent_user_id": "6982abcdef",
"sessions": ["*"],
"panels": ["*"]
}
}
}

Feishu
Uses WebSocket long connection — no public IP required.
1. Create a Feishu bot
- Visit Feishu Open Platform
- Create a new app → Enable Bot capability
- Permissions: im:message, im:message.p2p_msg:readonly, cardkit:card:write
- Events: Add im.message.receive_v1 → Select Long Connection mode
- Get App ID and App Secret
- Publish the app
2. Configure
{
"channels": {
"feishu": {
"enabled": true,
"appId": "cli_xxx",
"appSecret": "xxx",
"allowFrom": ["ou_YOUR_OPEN_ID"],
"groupPolicy": "mention",
"streaming": true
}
}
}

3. Run

adkbot gateway

DingTalk (钉钉)
Uses Stream Mode — no public IP required.
1. Create a DingTalk bot
- Visit DingTalk Open Platform
- Create a new app → Add Robot capability → Toggle Stream Mode ON
- Get AppKey and AppSecret
2. Configure
{
"channels": {
"dingtalk": {
"enabled": true,
"clientId": "YOUR_APP_KEY",
"clientSecret": "YOUR_APP_SECRET",
"allowFrom": ["YOUR_STAFF_ID"]
}
}
}

3. Run

adkbot gateway

Slack
Uses Socket Mode — no public URL required.
1. Create a Slack app
- Go to Slack API → Create New App → "From scratch"
2. Configure the app
- Socket Mode: Toggle ON → Generate an App-Level Token with connections:write scope → copy it (xapp-...)
- OAuth & Permissions: Add bot scopes: chat:write, reactions:write, app_mentions:read
- Event Subscriptions: Toggle ON → Subscribe to: message.im, message.channels, app_mention
- App Home: Enable Messages Tab → Check "Allow users to send Slash commands and messages from the messages tab"
- Install App: Click Install to Workspace → copy the Bot Token (xoxb-...)
3. Configure ADKBot
{
"channels": {
"slack": {
"enabled": true,
"botToken": "xoxb-...",
"appToken": "xapp-...",
"allowFrom": ["YOUR_SLACK_USER_ID"],
"groupPolicy": "mention"
}
}
}

4. Run

adkbot gateway

Email

Give ADKBot its own email account. It polls IMAP for incoming mail and replies via SMTP.
1. Get credentials (Gmail example)
- Create a dedicated Gmail account (e.g. [email protected])
- Enable 2-Step Verification → Create an App Password
2. Configure
{
"channels": {
"email": {
"enabled": true,
"consentGranted": true,
"imapHost": "imap.gmail.com",
"imapPort": 993,
"imapUsername": "[email protected]",
"imapPassword": "your-app-password",
"smtpHost": "smtp.gmail.com",
"smtpPort": 587,
"smtpUsername": "[email protected]",
"smtpPassword": "your-app-password",
"fromAddress": "[email protected]",
"allowFrom": ["[email protected]"]
}
}
}

3. Run

adkbot gateway

QQ (QQ单聊)
Uses botpy SDK with WebSocket — no public IP required. Currently supports private messages only.
1. Register & create bot
- Visit QQ Open Platform → Create a new bot application
- Copy AppID and AppSecret
2. Configure
{
"channels": {
"qq": {
"enabled": true,
"appId": "YOUR_APP_ID",
"secret": "YOUR_APP_SECRET",
"allowFrom": ["YOUR_OPENID"]
}
}
}

3. Run

adkbot gateway

WeChat (微信 / Weixin)
Uses HTTP long-poll with QR-code login.
1. Install with WeChat support
pip install "adkbot[weixin]"

2. Configure
{
"channels": {
"weixin": {
"enabled": true,
"allowFrom": ["YOUR_WECHAT_USER_ID"]
}
}
}

3. Login

adkbot channels login weixin

4. Run

adkbot gateway

WeCom (企业微信)
Uses WebSocket long connection — no public IP required.
1. Install
pip install "adkbot[wecom]"

2. Configure
{
"channels": {
"wecom": {
"enabled": true,
"botId": "your_bot_id",
"secret": "your_bot_secret",
"allowFrom": ["your_id"]
}
}
}

3. Run

adkbot gateway

Config file: ~/.adkbot/config.json
ADKBot uses LiteLLM under the hood, which means it supports 100+ LLM providers through a unified interface. Simply specify the model using a LiteLLM model string.
Tip
- Groq provides free voice transcription via Whisper. If configured, Telegram voice messages will be automatically transcribed.
- For local models, use ollama or vllm model strings.
| Provider | LiteLLM Model String | API Key Environment Variable |
|---|---|---|
| Google Gemini | gemini/gemini-3.1-pro-preview | GEMINI_API_KEY |
| NVIDIA NIM | nvidia_nim/nvidia/nemotron-3-super-120b-a12b | NVIDIA_NIM_API_KEY |
| Groq | groq/llama-3.3-70b-versatile | GROQ_API_KEY |
| Anthropic Claude | anthropic/claude-opus-4-6 | ANTHROPIC_API_KEY |
| OpenAI | openai/gpt-5.4 | OPENAI_API_KEY |
| OpenRouter | openrouter/anthropic/claude-opus-4-6 | OPENROUTER_API_KEY |
| xAI (Grok) | xai/grok-4.20-beta-0309-reasoning | GROK_API_KEY |
| DeepSeek | deepseek/deepseek-chat | DEEPSEEK_API_KEY |
| Ollama (local) | ollama/llama3.2 | None |
| vLLM (local) | openai/meta-llama/Llama-3.1-8B-Instruct + apiBase | Any (e.g., dummy) |
Gemini is a first-class citizen. ADKBot uses Google ADK natively, so Gemini models get the best possible integration.
LiteLLM Model String Format
LiteLLM uses the format provider/model-name or just model-name for native providers:
{
"agents": {
"defaults": {
"model": "gemini/gemini-3.1-pro-preview",
"apiKey": "",
"apiBase": null
}
}
}

- model: LiteLLM model string (e.g., gemini/gemini-3.1-pro-preview, nvidia_nim/nvidia/nemotron-3-super-120b-a12b)
- apiKey: Optional API key. If empty, uses environment variables
- apiBase: Optional custom API base URL (for self-hosted endpoints)

Examples:

- "gemini/gemini-3.1-pro-preview" - Google Gemini (uses GEMINI_API_KEY)
- "nvidia_nim/nvidia/nemotron-3-super-120b-a12b" - NVIDIA NIM (uses NVIDIA_NIM_API_KEY)
- "anthropic/claude-opus-4-6" - Anthropic Claude (uses ANTHROPIC_API_KEY)
- "ollama/llama3.2" - Local Ollama (no API key needed)
For 100+ providers, see: https://docs.litellm.ai/docs/providers
Ollama (local)
Run a local model with Ollama:
1. Start Ollama:
ollama run llama3.2

2. Add to config:
{
"agents": {
"defaults": {
"model": "ollama/llama3.2"
}
}
}

vLLM (local / OpenAI-compatible)
Run your own model with vLLM or any OpenAI-compatible server:
1. Start the server:
vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000

2. Add to config:
{
"agents": {
"defaults": {
"model": "openai/meta-llama/Llama-3.1-8B-Instruct",
"apiKey": "dummy",
"apiBase": "http://localhost:8000/v1"
}
}
}

For local servers that don't require a key, set apiKey to any non-empty string (e.g., "dummy").
Custom/OpenAI-compatible Endpoint
Connect to any OpenAI-compatible endpoint (LM Studio, llama.cpp, Together AI, Fireworks):
{
"agents": {
"defaults": {
"model": "openai/your-model-name",
"apiKey": "your-api-key",
"apiBase": "https://api.your-provider.com/v1"
}
}
}

Global settings that apply to all channels:
{
"channels": {
"sendProgress": true,
"sendToolHints": false,
"sendMaxRetries": 3,
"telegram": { "..." : "..." }
}
| Setting | Default | Description |
|---|---|---|
| sendProgress | true | Stream the agent's text progress to the channel |
| sendToolHints | false | Stream tool-call hints (e.g. read_file("…")) |
| sendMaxRetries | 3 | Max delivery attempts per outbound message |
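The sendMaxRetries setting is a bounded retry: the gateway re-attempts delivery up to that many times before giving up. The pattern can be sketched with a hypothetical send callable (this is illustrative, not ADKBot's actual dispatcher):

```python
def deliver(send, message: str, max_retries: int = 3) -> bool:
    """Attempt send(message) up to max_retries times; True on success.

    `send` is a hypothetical callable that raises ConnectionError on failure.
    """
    for _ in range(max_retries):
        try:
            send(message)
            return True
        except ConnectionError:
            continue  # a real dispatcher would back off between attempts
    return False
```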
Tip
Use proxy in tools.web to route all web requests through a proxy:
{ "tools": { "web": { "proxy": "http://127.0.0.1:7890" } } }

ADKBot supports multiple web search providers. Configure in ~/.adkbot/config.json under tools.web.search.
| Provider | Config fields | Env var fallback | Free |
|---|---|---|---|
| brave (default) | apiKey | BRAVE_API_KEY | No |
| tavily | apiKey | TAVILY_API_KEY | No |
| jina | apiKey | JINA_API_KEY | Free tier (10M tokens) |
| searxng | baseUrl | SEARXNG_BASE_URL | Yes (self-hosted) |
| duckduckgo | — | — | Yes |
When credentials are missing, ADKBot automatically falls back to DuckDuckGo.
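That fallback can be pictured as a credential check before choosing a provider. A sketch under the table above's assumptions (the names here are illustrative; ADKBot's actual selection logic lives in its web tool):

```python
import os

# Credential each provider needs; None means no credential required.
REQUIRED_ENV = {
    "brave": "BRAVE_API_KEY",
    "tavily": "TAVILY_API_KEY",
    "jina": "JINA_API_KEY",
    "searxng": "SEARXNG_BASE_URL",
    "duckduckgo": None,
}

def resolve_provider(search_cfg: dict) -> str:
    """Pick the configured provider, or DuckDuckGo if its credential is absent."""
    provider = search_cfg.get("provider", "brave")
    env_var = REQUIRED_ENV.get(provider)
    if env_var is None:
        return provider
    # A config value takes precedence over the environment fallback.
    if search_cfg.get("apiKey") or search_cfg.get("baseUrl") or os.environ.get(env_var):
        return provider
    return "duckduckgo"
```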
Search provider examples
Brave (default):
{
"tools": { "web": { "search": { "provider": "brave", "apiKey": "BSA..." } } }
}

Tavily:
{
"tools": { "web": { "search": { "provider": "tavily", "apiKey": "tvly-..." } } }
}

DuckDuckGo (zero config):
{
"tools": { "web": { "search": { "provider": "duckduckgo" } } }
}

Tip
The config format is compatible with Claude Desktop / Cursor. You can copy MCP server configs directly from any MCP server's README.
ADKBot supports MCP — connect external tool servers and use them as native agent tools.
{
"tools": {
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
},
"my-remote-mcp": {
"url": "https://example.com/mcp/",
"headers": { "Authorization": "Bearer xxxxx" }
}
}
}
}

| Mode | Config | Example |
|---|---|---|
| Stdio | command + args | Local process via npx / uvx |
| HTTP | url + headers (optional) | Remote endpoint |
MCP tools are automatically discovered and registered on startup. The LLM can use them alongside built-in tools — no extra configuration needed.
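Since the mcpServers block is plain JSON, you can quickly audit which servers a config defines and which transport each uses. A sketch keyed to the layout shown above (the helper name is ours, not an ADKBot API):

```python
import json

def list_mcp_servers(config_text: str) -> dict[str, str]:
    """Map each MCP server name in a config to 'http' or 'stdio'."""
    config = json.loads(config_text)
    servers = config.get("tools", {}).get("mcpServers", {})
    # A `url` key implies HTTP transport; `command` implies a stdio subprocess.
    return {name: "http" if "url" in spec else "stdio"
            for name, spec in servers.items()}

sample = '''
{"tools": {"mcpServers": {
  "filesystem": {"command": "npx", "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]},
  "my-remote-mcp": {"url": "https://example.com/mcp/"}
}}}
'''
```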
Tip
For production deployments, set "restrictToWorkspace": true to sandbox the agent.
| Option | Default | Description |
|---|---|---|
| tools.restrictToWorkspace | false | Restricts all tools to the workspace directory |
| tools.exec.enable | true | When false, disables shell command execution entirely |
| tools.exec.pathAppend | "" | Extra directories to append to PATH for shell commands |
| channels.*.allowFrom | [] (deny all) | Whitelist of user IDs. Use ["*"] to allow everyone |
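Conceptually, restrictToWorkspace is a resolved-path containment check: resolve the requested path against the workspace root and reject anything that escapes it. An illustrative sketch (ADKBot's real guard lives in its security module):

```python
from pathlib import Path

def is_within_workspace(requested: str, workspace: str) -> bool:
    """True if `requested` resolves to a path inside `workspace`.

    Resolving before comparing defeats `../` traversal tricks.
    """
    root = Path(workspace).resolve()
    target = (root / requested).resolve()
    return target == root or root in target.parents
```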
By default, ADKBot uses UTC. Set agents.defaults.timezone to your local timezone:
{
"agents": {
"defaults": {
"timezone": "Asia/Shanghai"
}
}
}

This affects the runtime time context, cron schedule defaults, and one-shot "at" times.
Common examples: UTC, America/New_York, America/Los_Angeles, Europe/London, Asia/Tokyo, Asia/Shanghai.
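An invalid identifier would silently break scheduling, so it is worth validating the value before saving the config. With the standard library's zoneinfo (requires the system tz database or the tzdata package):

```python
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError

def is_valid_timezone(name: str) -> bool:
    """True if `name` is a resolvable IANA timezone identifier."""
    try:
        ZoneInfo(name)
        return True
    except (ZoneInfoNotFoundError, ValueError):
        return False
```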
Run multiple ADKBot instances simultaneously with separate configs and runtime data.
# Create separate instance configs
adkbot onboard --config ~/.adkbot-telegram/config.json --workspace ~/.adkbot-telegram/workspace
adkbot onboard --config ~/.adkbot-discord/config.json --workspace ~/.adkbot-discord/workspace

Run instances:
# Instance A - Telegram bot
adkbot gateway --config ~/.adkbot-telegram/config.json
# Instance B - Discord bot
adkbot gateway --config ~/.adkbot-discord/config.json

| Component | Resolved From | Example |
|---|---|---|
| Config | --config path | ~/.adkbot-A/config.json |
| Workspace | --workspace or config | ~/.adkbot-A/workspace/ |
| Cron Jobs | config directory | ~/.adkbot-A/cron/ |
| Media / state | config directory | ~/.adkbot-A/media/ |
- Each instance must use a different port if they run concurrently
- Use a different workspace per instance for isolated memory and sessions
- --workspace overrides the workspace defined in the config file
View Common Commands
| Command | Description |
|---|---|
| adkbot onboard | Initialize config & workspace at ~/.adkbot/ |
| adkbot onboard --wizard | Launch the interactive onboarding wizard |
| adkbot agent -m "..." | Chat with the agent |
| adkbot agent | Interactive chat mode |
| adkbot gateway | Start the gateway (connects to chat channels) |
| adkbot status | Show status |
| adkbot channels login <channel> | Authenticate a channel interactively |
Interactive mode exits: exit, quit, /exit, /quit, :q, or Ctrl+D.
For a full list of commands and options, see the Comprehensive CLI Reference.
Heartbeat (Periodic Tasks)
The gateway wakes up every 30 minutes and checks HEARTBEAT.md in your workspace (~/.adkbot/workspace/HEARTBEAT.md). If the file has tasks, the agent executes them and delivers results to your most recently active chat channel.
Setup: edit ~/.adkbot/workspace/HEARTBEAT.md:
## Periodic Tasks
- [ ] Check weather forecast and send a summary
- [ ] Scan inbox for urgent emails

The agent can also manage this file itself; ask it to "add a periodic task" and it will update HEARTBEAT.md for you.
Note: The gateway must be running (adkbot gateway) and you must have chatted with the bot at least once.
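The task format is plain Markdown checkboxes, so the set of pending tasks is easy to extract. A sketch of how such a file could be parsed (illustrative only, not ADKBot's actual parser):

```python
import re

def pending_tasks(heartbeat_md: str) -> list[str]:
    """Return the text of unchecked `- [ ]` items in a HEARTBEAT.md body."""
    return re.findall(r"^\s*- \[ \] (.+)$", heartbeat_md, flags=re.MULTILINE)

sample = """## Periodic Tasks
- [ ] Check weather forecast and send a summary
- [x] Rotate the log files
- [ ] Scan inbox for urgent emails
"""
```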
Use ADKBot as a library — no CLI, no gateway, just Python:
from adkbot import AdkBot
bot = AdkBot.from_config()
result = await bot.run("Summarize the README")
print(result.content)

Each call carries a session_id for conversation isolation; different IDs get independent history:
await bot.run("hi", session_id="user-alice")
await bot.run("hi", session_id="task-42")

ADKBot uses ADK's native callback system for lifecycle hooks:
# Callbacks are configured via the Agent's before/after hooks
# See adkbot/agent/callbacks.py for the full callback API

ADKBot can expose a minimal OpenAI-compatible endpoint for local integrations:
pip install "adkbot[api]"
adkbot serve

By default, the API binds to 127.0.0.1:8900. Endpoints:

- GET /health
- GET /v1/models
- POST /v1/chat/completions
curl http://127.0.0.1:8900/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"messages": [{"role": "user", "content": "hi"}],
"session_id": "my-session"
}'

from openai import OpenAI
client = OpenAI(
base_url="http://127.0.0.1:8900/v1",
api_key="dummy",
)
resp = client.chat.completions.create(
model="adkbot",
messages=[{"role": "user", "content": "hi"}],
extra_body={"session_id": "my-session"},
)
print(resp.choices[0].message.content)

Tip
The -v ~/.adkbot:/root/.adkbot flag mounts your local config directory into the container for persistence.
docker compose run --rm adkbot-cli onboard # first-time setup
vim ~/.adkbot/config.json # add API keys
docker compose up -d adkbot-gateway                   # start gateway
docker compose run --rm adkbot-cli agent -m "Hello!"  # run CLI
docker compose logs -f adkbot-gateway                 # view logs
docker compose down                                   # stop

# Build the image
docker build -t adkbot .
# Initialize config (first time only)
docker run -v ~/.adkbot:/root/.adkbot --rm adkbot onboard
# Edit config on host to add API keys
vim ~/.adkbot/config.json
# Run gateway
docker run -v ~/.adkbot:/root/.adkbot -p 18790:18790 adkbot gateway
# Or run a single command
docker run -v ~/.adkbot:/root/.adkbot --rm adkbot agent -m "Hello!"

You can automatically run the gateway in the background on system boot using the built-in systemd installer.
1. Install and start the system service:
adkbot install-service

Note: To keep the gateway running after you log out of SSH, enable user lingering:
loginctl enable-linger $USER
Common operations:
systemctl --user status adkbot-gateway # check status
systemctl --user restart adkbot-gateway # restart after config changes
journalctl --user -u adkbot-gateway -f   # follow logs

adkbot/
├── agent/ # 🧠 Core agent (ADK Agent + Runner)
│ ├── callbacks.py # ADK lifecycle callbacks
│ ├── context.py # Prompt builder
│ ├── memory.py # Persistent memory
│ ├── skills.py # Skills loader
│ ├── subagent.py # Background task execution
│ └── tools/ # Built-in tools (10+ tools)
├── adkbot.py # 🤖 Main AdkBot class (ADK Agent + Runner + LiteLLM)
├── skills/ # 🎯 Bundled skills
├── channels/ # 📱 Chat channel integrations (12+ channels)
├── bus/ # 🚌 Message routing
├── cron/ # ⏰ Scheduled tasks
├── heartbeat/ # 💓 Proactive wake-up
├── session/ # 💬 Conversation sessions
├── config/ # ⚙️ Configuration
├── security/ # 🔒 Safety guards & SSRF protection
└── cli/ # 🖥️ CLI commands
PRs welcome! The codebase is intentionally readable and well-structured. 🤗
Roadmap:
- Multi-modal — See and hear (images, voice, video)
- Long-term memory — Never forget important context
- Better reasoning — Multi-step planning and reflection
- More integrations — Calendar, GitHub, and more
- ADK Web UI — Built-in web interface via adk web
- Self-improvement — Learn from feedback and mistakes
By Kiri Research Labs
Inspired by OpenClaw
ADKBot is for educational, research, and technical exchange purposes only
