The agentic shell for building and running AI teams from the command line.
npcsh makes the most of multi-modal LLMs and agents through slash commands and interactive modes, all from the command line. Build teams of agents, schedule them on jobs, engineer context, and design custom Jinja Execution templates (Jinxes) for you and your agents to invoke.
```bash
pip install 'npcsh[lite]'
```
Once installed, run `npcsh` to enter the NPC shell. The install also provides the CLI tools `npc`, `wander`, `spool`, `yap`, and `nql`.
`.npc` and `.jinx` files are directly executable with shebangs (`#!/usr/bin/env npc`):
```bash
npc ./myagent.npc "summarize this repo"   # run an NPC with a prompt
npc ./script.jinx bash_command="ls -la"   # run a jinx directly
./myagent.npc "hello"                     # or just execute it (with shebang)
```
How well can a model drive npcsh as an agentic shell? 125 tasks across 15 categories — from basic shell commands to multi-step workflows, code debugging, and tool chaining — scored pass/fail. Comparisons with other agent coders are coming soon.
| Family | Model | Score |
|---|---|---|
| Qwen3.5 | 0.8b | 31/125 (24%) |
| | 2b | 81/125 (65%) |
| | 4b | 77/125 (62%) |
| | 9b | 100/125 (80%) |
| | 35b | 111/125 (88%) |
| Qwen3 | 0.6b | — |
| | 1.7b | 42/125 (34%) |
| | 4b | 94/125 (75%) |
| | 8b | 85/125 (68%) |
| | 30b | 103/125 (82%) |
| Gemma3 | 1b | — |
| | 4b | 37/125 (30%) |
| | 12b | 77/125 (62%) |
| | 27b | 73/125 (58%) |
| Llama | 3.2:1b | — |
| | 3.2:3b | 26/125 (20%) |
| | 3.1:8b | 60/125 (48%) |
| Mistral | small3.2 | 72/125 (57%) |
| | ministral-3 | 51/125 (40%) |
| Phi | phi4 | 58/125 (46%) |
| GPT-OSS | 20b | 94/125 (75%) |
| OLMo2 | 7b | 13/125 (10%) |
| | 13b | 47/125 (38%) |
| Cogito | 3b | 10/125 (8%) |
| GLM | 4.7-flash | 102/125 (82%) |
| Gemini | 2.5-flash | — |
| | 3.1-flash | — |
| | 3.1-pro | — |
| Claude | 4.6-sonnet | — |
| | 4.5-haiku | — |
| GPT | 5-mini | — |
| DeepSeek | chat | — |
| | reasoner | — |
Category breakdown (completed models)
| Category | Qwen3.5 0.8b | Qwen3.5 2b | Qwen3.5 9b | Qwen3.5 35b | Qwen3 1.7b | Qwen3 4b | Qwen3 8b | Qwen3 30b | Qwen3 0.6b | Gemma3 4b | Gemma3 12b | Gemma3 27b | Llama 3.2:3b | Mistral small3.2 | Mistral ministral-3 | Phi phi4 | GPT-OSS 20b | Cogito 3b | GLM 4.7-flash |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| shell (10) | 5 | 6 | 10 | 10 | 8 | 8 | 9 | 9 | — | 6 | 6 | 9 | 6 | 10 | 7 | 9 | 10 | 0 | 10 |
| file-ops (10) | 8 | 9 | 10 | 10 | 8 | 10 | 9 | 10 | — | 6 | 9 | 10 | 2 | 6 | 10 | 10 | 10 | 0 | 10 |
| python (10) | 0 | 3 | 9 | 10 | 0 | 5 | 6 | 6 | — | 0 | 3 | 1 | 0 | 3 | 6 | 4 | 10 | 0 | 10 |
| data (10) | 0 | 2 | 4 | 6 | 2 | 4 | 5 | 6 | — | 1 | 5 | 7 | 0 | 5 | 9 | 4 | 6 | 0 | 5 |
| system (10) | 2 | 8 | 9 | 10 | 7 | 9 | 7 | 10 | — | 5 | 9 | 7 | 2 | 9 | 6 | 6 | 9 | 0 | 10 |
| text (10) | 1 | 7 | 6 | 8 | 2 | 10 | 6 | 7 | — | 3 | 9 | 8 | 1 | 7 | 0 | 4 | 8 | 0 | 7 |
| debug (10) | 2 | 6 | 10 | 10 | 0 | 4 | 2 | 10 | — | 0 | 3 | 0 | 0 | 4 | 0 | 0 | 9 | 0 | 9 |
| git (10) | 0 | 8 | 6 | 9 | 2 | 9 | 9 | 8 | — | 4 | 6 | 9 | 4 | 8 | 4 | 6 | 8 | 0 | 5 |
| multi-step (10) | 0 | 6 | 7 | 6 | 0 | 6 | 3 | 7 | — | 3 | 5 | 5 | 2 | 3 | 0 | 5 | 4 | 0 | 5 |
| scripting (10) | 1 | 5 | 8 | 10 | 0 | 7 | 2 | 6 | — | 0 | 2 | 1 | 0 | 3 | 1 | 3 | 7 | 0 | 8 |
| image-gen (5) | 5 | 5 | 5 | 5 | 5 | 5 | 5 | 5 | — | 3 | 5 | 3 | 5 | 5 | 1 | 2 | 5 | 5 | 5 |
| audio-gen (5) | 5 | 4 | 5 | 5 | 5 | 5 | 5 | 5 | — | 4 | 5 | 5 | 4 | 5 | 1 | 5 | 5 | 5 | 5 |
| web-search (5) | 1 | 5 | 4 | 5 | 1 | 5 | 4 | 5 | — | 1 | 5 | 5 | 0 | 4 | 5 | 0 | 3 | 0 | 5 |
| delegation (5) | 0 | 2 | 3 | 3 | 0 | 2 | 2 | 4 | — | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
| tool-chain (5) | 1 | 5 | 4 | 4 | 2 | 5 | 2 | 5 | — | 1 | 3 | 3 | 0 | 0 | 1 | 0 | 0 | 0 | 5 |
| Total (125) | 31 | 81 | 100 | 111 | 42 | 94 | 76 | 103 | — | 37 | 77 | 73 | 26 | 72 | 51 | 58 | 94 | 10 | 102 |
Run the benchmark locally:
```bash
python -m npcsh.benchmark.local_runner --model qwen3:4b --provider ollama
```
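The runner benchmarks one model at a time; a simple shell loop can sweep several. A minimal sketch (the model names here are examples, and the `echo` keeps it a dry run):

```shell
# Print the benchmark invocations for a sweep of local Ollama models.
# Remove the `echo` to actually run them; each run takes a while.
for model in qwen3.5:2b qwen3:4b gemma3:12b; do
  echo python -m npcsh.benchmark.local_runner --model "$model" --provider ollama
done
```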
- Get help with a task:

  ```
  npcsh> can you help me identify what process is listening on port 5337?
  ```

- Edit files:

  ```
  npcsh> please read through the markdown files in the docs folder and suggest changes
  ```

- Search & Knowledge:

  ```bash
  /web_search "cerulean city"   # Web search
  /db_search "query"            # Database search
  /file_search "pattern"        # File search
  /memories                     # Interactive memory browser TUI
  /kg                           # Interactive knowledge graph TUI
  /nql                          # Database query TUI
  ```

- Computer Use:

  ```bash
  /computer_use
  ```

- Generate Images:

  ```bash
  /vixynt 'generate an image of a rabbit eating ham in the brink of dawn' model='gpt-image-1' provider='openai'
  ```

- Generate Videos:

  ```bash
  /roll 'generate a video of a hat riding a dog' veo-3.1-fast-generate-preview gemini
  ```

- Multi-Agent Discussions:

  ```bash
  /convene "Is the universe a simulation?" npcs=alicanto,corca,guac rounds=3
  ```

- Serve an NPC Team:

  ```bash
  /serve --port 5337 --cors='http://localhost:5137/'
  ```
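Once served, the team can be queried over HTTP. A hypothetical request, assuming the standard OpenAI route (`/v1/chat/completions` is the OpenAI convention, not confirmed here; check the docs for the exact path and payload, and "analyst" is an example NPC name):

```shell
# Build a chat request payload addressed to an example NPC.
payload='{"model": "analyst", "messages": [{"role": "user", "content": "hello"}]}'

# Requires a running `/serve` instance on port 5337.
curl -s http://localhost:5337/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d "$payload" || true   # ignore connection errors if nothing is listening
```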
npcsh supports multiple ways to define agents inside your npc_team/ directory. You can mix all three formats — .npc files take precedence if names collide.
`.npc` files — Full-featured YAML agent definitions with model, provider, jinxes, and more:
```yaml
#!/usr/bin/env npc
name: analyst
primary_directive: You analyze data and provide insights.
model: qwen3:8b
provider: ollama
jinxes:
  - skills/data-analysis
```
`agents.md` — Define multiple agents in a single markdown file. Each `##` heading = agent name, body = directive:
```markdown
## summarizer
You summarize long documents into concise bullet points.

## fact_checker
You verify claims against reliable sources and flag inaccuracies.
```
`agents/` directory — One `.md` file per agent. Filename (minus `.md`) = agent name. Supports YAML frontmatter:
```markdown
---
model: gemini-2.5-flash
provider: gemini
---
You translate content between languages while preserving tone and idiom.
```
All three formats are supported by both the Python and Rust editions of npcsh. Agents from `agents.md` and `agents/` inherit the team's default model/provider from `team.ctx`.
The full team structure:
```
npc_team/
├── team.ctx           # Team config (model, provider, forenpc, context)
├── coordinator.npc    # YAML agent definitions
├── analyst.npc
├── agents.md          # Markdown-defined agents
├── agents/            # One .md file per agent
│   └── translator.md
├── jinxes/            # Workflows and tools
│   ├── research.jinx
│   └── skills/        # Knowledge-content skills
└── tools/             # Custom tool functions
```
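A minimal version of this skeleton can be scaffolded in a few lines of shell. This is a sketch: the `team.ctx` keys mirror the config fields listed in the tree but the exact schema should be checked against the docs, and the agent content is illustrative:

```shell
# Create a minimal team: one config file, one markdown-defined agent.
mkdir -p npc_team/agents npc_team/jinxes

# Team defaults (keys assumed from the config fields noted above).
cat > npc_team/team.ctx <<'EOF'
model: qwen3:8b
provider: ollama
EOF

# One markdown-defined agent: heading = name, body = directive.
cat > npc_team/agents.md <<'EOF'
## summarizer
You summarize long documents into concise bullet points.
EOF
```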
This means you can bring agents from other ecosystems — if you already have an agents.md or an agents/ directory from Claude Code, Codex, Amp, or any other tool, just drop them into your npc_team/ and npcsh will pick them up alongside your .npc files.
Your npc_team/ works beyond npcsh — you can launch any major AI coding tool as an NPC from your team using the CLI launchers from npcpy. Each tool gets the NPC's persona injected and gains awareness of the other team members.
```bash
pip install npcpy   # if not already installed

# Launch Claude Code as an NPC (interactive picker)
npc-claude

# Launch as a specific NPC
npc-claude --npc corca

# Same for other coding tools
npc-codex --npc researcher
npc-gemini --npc analyst
npc-opencode --npc coder
npc-aider --npc reviewer
npc-amp --npc writer

# Point to a specific team directory
npc-claude --team ./my_project/npc_team
```
The launcher discovers your team from `./npc_team` or `~/.npcsh/npc_team`, lets you pick an NPC, and starts the tool with that NPC's directive. For Claude Code, it also passes the other NPCs as sub-agents via `--agents`.
For deeper integration (jinxes exposed as MCP tools, team switching mid-conversation), register the NPC plugin:
```bash
npc-plugin claude   # install MCP server + hooks
npc-plugin codex    # same for Codex
npc-plugin gemini   # same for Gemini CLI
```
- Agents (NPCs) — AI agents with personas, directives, and tool sets
- Team Orchestration — Delegation, review loops, multi-NPC discussions
- Jinxes — Jinja Execution templates — reusable tools for users and agents
- Skills — Knowledge-content jinxes with progressive section disclosure
- NQL — SQL models with embedded AI functions (Snowflake, BigQuery, Databricks, SQLite)
- Knowledge Graphs — Build and evolve knowledge graphs from conversations
- Deep Research — Multi-agent hypothesis generation, persona sub-agents, paper writing
- Computer Use — GUI automation with vision
- Image, Audio & Video — Generation via Ollama, diffusers, OpenAI, Gemini
- MCP Integration — Full MCP server support with agentic shell TUI
- API Server — Serve teams via OpenAI-compatible REST API
Works with all major LLM providers through LiteLLM: ollama, openai, anthropic, gemini, deepseek, openai-like, and more.
```bash
pip install 'npcsh[lite]'    # API providers (ollama, gemini, anthropic, openai, etc.)
pip install 'npcsh[local]'   # Local models (diffusers/transformers/torch)
pip install 'npcsh[yap]'     # Voice mode
pip install 'npcsh[all]'     # Everything
```
System dependencies
Linux:
```bash
sudo apt-get install espeak portaudio19-dev python3-pyaudio ffmpeg libcairo2-dev libgirepository1.0-dev
curl -fsSL https://ollama.com/install.sh | sh
ollama pull qwen3.5:2b
```
macOS:
```bash
brew install portaudio ffmpeg pygobject3 ollama
brew services start ollama
ollama pull qwen3.5:2b
```
Windows: Install Ollama and ffmpeg, then `ollama pull qwen3.5:2b`.
API keys go in a `.env` file:
```bash
export OPENAI_API_KEY="your_key"
export ANTHROPIC_API_KEY="your_key"
export GEMINI_API_KEY="your_key"
```
A native Rust build of npcsh is available — same shell, same DB, same team files, faster startup. Still experimental.
```bash
cd npcsh/rust && cargo build --release
cp target/release/npcsh ~/.local/bin/npc   # or wherever you want
```
Both editions share `~/npcsh_history.db` and `~/.npcsh/npc_team/` and can be used interchangeably.
Full documentation, guides, and API reference at npc-shell.readthedocs.io.
- npcpy — Python framework for building AI agents and teams
- Incognide — Desktop workspace for the NPC Toolkit (download)
- Newsletter — Stay in the loop
- Quantum-like nature of natural language interpretation: arxiv, accepted at QNLP 2025
- Simulating hormonal cycles for AI: arxiv
Discord | Monthly donation | Merch | Consulting: [email protected]
Contributions welcome! Submit issues and pull requests on the GitHub repository.
MIT License.