WabiSabi is a terminal-based AI coding assistant that connects to a Substratum backend or runs against a local Ollama installation.
- 🧠 Intelligent Agents: Build, Plan, and Search agents
- 🔗 Substratum Ready: Connect to central backend for models
- 🏠 Local Mode: Run with local Ollama installation
- 📦 Plugins: Compatible with Claude Code and OpenCode plugins
# Clone and setup
git clone https://github.com/ascendwave/wabisabi
cd wabisabi/packages/terminal
# Install dependencies
bun install
# Build
bun build
# Run CLI
./dist/index.js --help

wabisabi interactive # Start interactive mode
wabisabi batch <file> # Run batch tasks from JSON file
wabisabi stream # Streaming mode
wabisabi agent build # Code generation agent
wabisabi agent plan # Task planning agent
wabisabi agent search # Research agent

Set environment variables or use CLI options:
# CLI options
./dist/index.js --substratum http://localhost:3001 --model llama3.2
# Environment variables
export WABISABI_SUBSTRATUM=http://localhost:3001
export WABISABI_OLLAMA=http://localhost:11434
export WABISABI_MODEL=llama3.2

- Build agent: generates complete, working code based on your requirements.
- Plan agent: creates detailed task plans with steps, dependencies, and risks.
- Search agent: researches topics and provides comprehensive information.
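Given the environment variables above, backend selection (Substratum first, local Ollama as fallback) might be resolved along these lines. This is a hypothetical sketch: `resolveBackend`, its precedence rule, and the `BackendConfig` shape are assumptions, not WabiSabi's actual implementation.

```typescript
// Hypothetical sketch of backend selection from the environment
// variables documented above (WABISABI_SUBSTRATUM, WABISABI_OLLAMA,
// WABISABI_MODEL). Names and precedence are assumptions.

interface BackendConfig {
  baseUrl: string;
  model: string;
}

function resolveBackend(env: Record<string, string | undefined>): BackendConfig {
  const model = env.WABISABI_MODEL ?? "llama3.2";
  // Prefer the central Substratum backend when configured...
  if (env.WABISABI_SUBSTRATUM) {
    return { baseUrl: env.WABISABI_SUBSTRATUM, model };
  }
  // ...otherwise fall back to a local Ollama instance on its default port.
  return { baseUrl: env.WABISABI_OLLAMA ?? "http://localhost:11434", model };
}

// Example: Substratum takes precedence when both are set.
const cfg = resolveBackend({
  WABISABI_SUBSTRATUM: "http://localhost:3001",
  WABISABI_OLLAMA: "http://localhost:11434",
});
console.log(cfg.baseUrl); // http://localhost:3001
```

CLI options such as `--substratum` and `--model` would presumably override these environment defaults.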
A simple web interface is available in wabisabi-web-next/:
cd wabisabi-web-next
bun install
bun dev

wabisabi/
├── packages/terminal/ # CLI Tool
│ ├── src/
│ │ ├── cli/ # Entry point
│ │ ├── modes/ # interactive, batch, streaming
│ │ ├── clients/ # API clients
│ │ ├── agents/ # Build, Plan, Search agents
│ │ └── plugins/ # Plugin system
│ └── dist/ # Built binary
├── wabisabi-web-next/ # Web interface
├── BASE/ # Brand assets
└── README.md
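As an illustration of batch mode (`wabisabi batch <file>` above): the task-file schema isn't documented here, so the field names below are purely hypothetical assumptions.

```json
{
  "tasks": [
    { "agent": "build", "prompt": "Add a --version flag to the CLI" },
    { "agent": "plan", "prompt": "Outline steps to add plugin hot-reloading" }
  ]
}
```

Saved as `tasks.json`, this would be run with `wabisabi batch tasks.json`.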
MIT
Built with 🌀 by Arkessiah