
npcsh

The agentic shell for building and running AI teams from the command line.



npcsh makes the most of multi-modal LLMs and agents through slash commands and interactive modes, all from the command line. Build teams of agents, schedule them on jobs, engineer context, and design custom Jinja Execution templates (Jinxes) for you and your agents to invoke.

pip install 'npcsh[lite]'

Once installed, run npcsh to enter the NPC shell. The package also provides the CLI tools npc, wander, spool, yap, and nql.

.npc and .jinx files are directly executable with shebangs (#!/usr/bin/env npc):

npc ./myagent.npc "summarize this repo"     # run an NPC with a prompt
npc ./script.jinx bash_command="ls -la"     # run a jinx directly
./myagent.npc "hello"                       # or just execute it (with shebang)
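
A minimal executable agent can be created in two commands. This is a sketch: the YAML fields mirror the .npc format documented under Agent Formats below, but the model and provider values here are just examples.

```shell
# Sketch: create a self-executing agent file. The shebang makes it runnable
# directly; model/provider are example values, adjust for your setup.
cat > myagent.npc <<'EOF'
#!/usr/bin/env npc
name: myagent
primary_directive: You summarize repositories concisely.
model: qwen3:8b
provider: ollama
EOF
chmod +x myagent.npc
# With npcsh installed you could now run: ./myagent.npc "hello"
```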

Benchmark Results

How well can a model drive npcsh as an agentic shell? 125 tasks across 15 categories — from basic shell commands to multi-step workflows, code debugging, and tool chaining — scored pass/fail. Comparisons with other agent coders coming soon.

| Family   | Model       | Score          |
|----------|-------------|----------------|
| Qwen3.5  | 0.8b        | 31/125 (24%)   |
|          | 2b          | 81/125 (65%)   |
|          | 4b          | 77/125 (62%)   |
|          | 9b          | 100/125 (80%)  |
|          | 35b         | 111/125 (88%)  |
| Qwen3    | 0.6b        |                |
|          | 1.7b        | 42/125 (34%)   |
|          | 4b          | 94/125 (75%)   |
|          | 8b          | 85/125 (68%)   |
|          | 30b         | 103/125 (82%)  |
| Gemma3   | 1b          |                |
|          | 4b          | 37/125 (30%)   |
|          | 12b         | 77/125 (62%)   |
|          | 27b         | 73/125 (58%)   |
| Llama    | 3.2:1b      |                |
|          | 3.2:3b      | 26/125 (20%)   |
|          | 3.1:8b      | 60/125 (48%)   |
| Mistral  | small3.2    | 72/125 (57%)   |
|          | ministral-3 | 51/125 (40%)   |
| Phi      | phi4        | 58/125 (46%)   |
| GPT-OSS  | 20b         | 94/125 (75%)   |
| OLMo2    | 7b          | 13/125 (10%)   |
|          | 13b         | 47/125 (38%)   |
| Cogito   | 3b          | 10/125 (8%)    |
| GLM      | 4.7-flash   | 102/125 (82%)  |
| Gemini   | 2.5-flash   |                |
|          | 3.1-flash   |                |
|          | 3.1-pro     |                |
| Claude   | 4.6-sonnet  |                |
|          | 4.5-haiku   |                |
| GPT      | 5-mini      |                |
| DeepSeek | chat        |                |
|          | reasoner    |                |
Category breakdown (completed models)

| Category        | Qwen3.5 0.8b | 2b | 9b | 35b | Qwen3 1.7b | 4b | 8b | 30b | Gemma3 4b | 12b | 27b | Llama 3.2:3b | small3.2 | ministral-3 | phi4 | GPT-OSS 20b | Cogito 3b | GLM 4.7-flash |
|-----------------|----|----|-----|-----|----|----|----|-----|----|----|----|----|----|----|----|----|----|-----|
| shell (10)      | 5  | 6  | 10  | 10  | 8  | 8  | 9  | 9   | 6  | 6  | 9  | 6  | 10 | 7  | 9  | 10 | 0  | 10  |
| file-ops (10)   | 8  | 9  | 10  | 10  | 8  | 10 | 9  | 10  | 6  | 9  | 10 | 2  | 6  | 10 | 10 | 10 | 0  | 10  |
| python (10)     | 0  | 3  | 9   | 10  | 0  | 5  | 6  | 6   | 0  | 3  | 1  | 0  | 3  | 6  | 4  | 10 | 0  | 10  |
| data (10)       | 0  | 2  | 4   | 6   | 2  | 4  | 5  | 6   | 1  | 5  | 7  | 0  | 5  | 9  | 4  | 6  | 0  | 5   |
| system (10)     | 2  | 8  | 9   | 10  | 7  | 9  | 7  | 10  | 5  | 9  | 7  | 2  | 9  | 6  | 6  | 9  | 0  | 10  |
| text (10)       | 1  | 7  | 6   | 8   | 2  | 10 | 6  | 7   | 3  | 9  | 8  | 1  | 7  | 0  | 4  | 8  | 0  | 7   |
| debug (10)      | 2  | 6  | 10  | 10  | 0  | 4  | 2  | 10  | 0  | 3  | 0  | 0  | 4  | 0  | 0  | 9  | 0  | 9   |
| git (10)        | 0  | 8  | 6   | 9   | 2  | 9  | 9  | 8   | 4  | 6  | 9  | 4  | 8  | 4  | 6  | 8  | 0  | 5   |
| multi-step (10) | 0  | 6  | 7   | 6   | 0  | 6  | 3  | 7   | 3  | 5  | 5  | 2  | 3  | 0  | 5  | 4  | 0  | 5   |
| scripting (10)  | 1  | 5  | 8   | 10  | 0  | 7  | 2  | 6   | 0  | 2  | 1  | 0  | 3  | 1  | 3  | 7  | 0  | 8   |
| image-gen (5)   | 5  | 5  | 5   | 5   | 5  | 5  | 5  | 5   | 3  | 5  | 3  | 5  | 5  | 1  | 2  | 5  | 5  | 5   |
| audio-gen (5)   | 5  | 4  | 5   | 5   | 5  | 5  | 5  | 5   | 4  | 5  | 5  | 4  | 5  | 1  | 5  | 5  | 5  | 5   |
| web-search (5)  | 1  | 5  | 4   | 5   | 1  | 5  | 4  | 5   | 1  | 5  | 5  | 0  | 4  | 5  | 0  | 3  | 0  | 5   |
| delegation (5)  | 0  | 2  | 3   | 3   | 0  | 2  | 2  | 4   | 0  | 2  | 0  | 0  | 0  | 0  | 0  | 0  | 0  | 3   |
| tool-chain (5)  | 1  | 5  | 4   | 4   | 2  | 5  | 2  | 5   | 1  | 3  | 3  | 0  | 0  | 1  | 0  | 0  | 0  | 5   |
| Total (125)     | 31 | 81 | 100 | 111 | 42 | 94 | 76 | 103 | 37 | 77 | 73 | 26 | 72 | 51 | 58 | 94 | 10 | 102 |
Run the benchmark yourself against a local model:

python -m npcsh.benchmark.local_runner --model qwen3:4b --provider ollama

Usage

  • Get help with a task:

    npcsh> can you help me identify what process is listening on port 5337?
  • Edit files:

    npcsh> please read through the markdown files in the docs folder and suggest changes
  • Search & Knowledge

    /web_search "cerulean city"            # Web search
    /db_search "query"                     # Database search
    /file_search "pattern"                 # File search
    /memories                              # Interactive memory browser TUI
    /kg                                    # Interactive knowledge graph TUI
    /nql                                   # Database query TUI


  • Computer Use

    /computer_use

    Plonk, the GUI automation TUI

  • Generate Images

    /vixynt 'generate an image of a rabbit eating ham in the brink of dawn' model='gpt-image-1' provider='openai'


  • Generate Videos

    /roll 'generate a video of a hat riding a dog' veo-3.1-fast-generate-preview gemini


  • Multi-Agent Discussions

    /convene "Is the universe a simulation?" npcs=alicanto,corca,guac rounds=3


  • Serve an NPC Team

    /serve --port 5337 --cors='http://localhost:5137/'
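
    Once a team is served, any OpenAI-compatible client can talk to it. A hedged sketch follows: the /v1/chat/completions path and the use of an NPC name as the "model" field are assumptions based on the OpenAI-compatible REST API listed under Features, not documented behavior.

```shell
# Sketch only: assumes the served team follows the standard OpenAI-style
# chat completions convention; the NPC name as "model" is an assumption.
cat > payload.json <<'EOF'
{"model": "analyst", "messages": [{"role": "user", "content": "hello"}]}
EOF
# With /serve running on port 5337:
# curl -s http://localhost:5337/v1/chat/completions \
#      -H "Content-Type: application/json" -d @payload.json
grep -q '"messages"' payload.json && echo "payload ready"
```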

Agent Formats

npcsh supports multiple ways to define agents inside your npc_team/ directory. You can mix all three formats — .npc files take precedence if names collide.

.npc files — Full-featured YAML agent definitions with model, provider, jinxes, and more:

#!/usr/bin/env npc
name: analyst
primary_directive: You analyze data and provide insights.
model: qwen3:8b
provider: ollama
jinxes:
  - skills/data-analysis

agents.md — Define multiple agents in a single markdown file. Each ## heading = agent name, body = directive:

## summarizer
You summarize long documents into concise bullet points.

## fact_checker
You verify claims against reliable sources and flag inaccuracies.
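
The heading-to-agent rule is mechanical. A rough awk sketch (illustrative only, not npcsh's actual loader) pulls the agent names out of the file above:

```shell
# Illustrative: each "## heading" in agents.md starts a new agent, so the
# second field of every "## " line is an agent name.
cat > agents.md <<'EOF'
## summarizer
You summarize long documents into concise bullet points.

## fact_checker
You verify claims against reliable sources and flag inaccuracies.
EOF
awk '/^## /{print $2}' agents.md
```

This prints `summarizer` and `fact_checker`, one per line.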

agents/ directory — One .md file per agent. Filename (minus .md) = agent name. Supports YAML frontmatter:

---
model: gemini-2.5-flash
provider: gemini
---
You translate content between languages while preserving tone and idiom.
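
Splitting the frontmatter from the directive is equally simple. This awk sketch (illustrative, not npcsh's loader) prints just the directive body, i.e. everything after the second `---` line:

```shell
# Illustrative: write an agents/translator.md, then strip the YAML
# frontmatter (everything between the two "---" fences) to get the directive.
mkdir -p agents
cat > agents/translator.md <<'EOF'
---
model: gemini-2.5-flash
provider: gemini
---
You translate content between languages while preserving tone and idiom.
EOF
awk '/^---$/{n++; next} n>=2' agents/translator.md
```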

All three formats are supported by both the Python and Rust editions of npcsh. Agents from agents.md and agents/ inherit the team's default model/provider from team.ctx.

The full team structure:

npc_team/
├── team.ctx           # Team config (model, provider, forenpc, context)
├── coordinator.npc    # YAML agent definitions
├── analyst.npc
├── agents.md          # Markdown-defined agents
├── agents/            # One .md file per agent
│   └── translator.md
├── jinxes/            # Workflows and tools
│   ├── research.jinx
│   └── skills/        # Knowledge-content skills
└── tools/             # Custom tool functions

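
The layout above can be scaffolded with plain shell. File contents here are minimal placeholders; only the model/provider keys in team.ctx follow what this README describes, so treat the rest as illustrative:

```shell
# Sketch: scaffold the npc_team/ layout shown above with placeholder
# contents; exact team.ctx keys may differ in practice.
mkdir -p npc_team/agents npc_team/jinxes/skills npc_team/tools
cat > npc_team/team.ctx <<'EOF'
model: qwen3:8b
provider: ollama
EOF
cat > npc_team/agents.md <<'EOF'
## summarizer
You summarize long documents into concise bullet points.
EOF
cat > npc_team/agents/translator.md <<'EOF'
You translate content between languages while preserving tone and idiom.
EOF
ls npc_team
```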
This means you can bring agents from other ecosystems — if you already have an agents.md or an agents/ directory from Claude Code, Codex, Amp, or any other tool, just drop them into your npc_team/ and npcsh will pick them up alongside your .npc files.


Launching AI Coding Tools with NPC Teams

Your npc_team/ works beyond npcsh — you can launch any major AI coding tool as an NPC from your team using the CLI launchers from npcpy. Each tool gets the NPC's persona injected and gains awareness of the other team members.

pip install npcpy   # if not already installed

# Launch Claude Code as an NPC (interactive picker)
npc-claude

# Launch as a specific NPC
npc-claude --npc corca

# Same for other coding tools
npc-codex --npc researcher
npc-gemini --npc analyst
npc-opencode --npc coder
npc-aider --npc reviewer
npc-amp --npc writer

# Point to a specific team directory
npc-claude --team ./my_project/npc_team

The launcher discovers your team from ./npc_team or ~/.npcsh/npc_team, lets you pick an NPC, and starts the tool with that NPC's directive. For Claude Code, it also passes the other NPCs as sub-agents via --agents.

For deeper integration (jinxes exposed as MCP tools, team switching mid-conversation), register the NPC plugin:

npc-plugin claude    # install MCP server + hooks
npc-plugin codex     # same for Codex
npc-plugin gemini    # same for Gemini CLI

Features

  • Agents (NPCs) — AI agents with personas, directives, and tool sets
  • Team Orchestration — Delegation, review loops, multi-NPC discussions
  • Jinxes — Jinja Execution templates — reusable tools for users and agents
  • Skills — Knowledge-content jinxes with progressive section disclosure
  • NQL — SQL models with embedded AI functions (Snowflake, BigQuery, Databricks, SQLite)
  • Knowledge Graphs — Build and evolve knowledge graphs from conversations
  • Deep Research — Multi-agent hypothesis generation, persona sub-agents, paper writing
  • Computer Use — GUI automation with vision
  • Image, Audio & Video — Generation via Ollama, diffusers, OpenAI, Gemini
  • MCP Integration — Full MCP server support with agentic shell TUI
  • API Server — Serve teams via OpenAI-compatible REST API

Works with all major LLM providers through LiteLLM: ollama, openai, anthropic, gemini, deepseek, openai-like, and more.


Installation

pip install 'npcsh[lite]'        # API providers (ollama, gemini, anthropic, openai, etc.)
pip install 'npcsh[local]'       # Local models (diffusers/transformers/torch)
pip install 'npcsh[yap]'         # Voice mode
pip install 'npcsh[all]'         # Everything
System dependencies

Linux:

sudo apt-get install espeak portaudio19-dev python3-pyaudio ffmpeg libcairo2-dev libgirepository1.0-dev
curl -fsSL https://ollama.com/install.sh | sh
ollama pull qwen3.5:2b

macOS:

brew install portaudio ffmpeg pygobject3 ollama
brew services start ollama
ollama pull qwen3.5:2b

Windows: Install Ollama and ffmpeg, then ollama pull qwen3.5:2b.

API keys go in a .env file:

export OPENAI_API_KEY="your_key"
export ANTHROPIC_API_KEY="your_key"
export GEMINI_API_KEY="your_key"

Rust Edition (experimental)

A native Rust build of npcsh is available — same shell, same DB, same team files, faster startup. Still experimental.

cd npcsh/rust && cargo build --release
cp target/release/npcsh ~/.local/bin/npc   # or wherever you want

Both editions share ~/npcsh_history.db and ~/.npcsh/npc_team/ and can be used interchangeably.

Read the Docs

Full documentation, guides, and API reference at npc-shell.readthedocs.io.

Links

Research

  • Quantum-like nature of natural language interpretation: arxiv, accepted at QNLP 2025
  • Simulating hormonal cycles for AI: arxiv

Community & Support

Discord | Monthly donation | Merch | Consulting: [email protected]

Contributing

Contributions welcome! Submit issues and pull requests on the GitHub repository.

License

MIT License.
