Why Sediment
Built different.
No services. No config. No API keys. Just one binary that gives your AI agents persistent memory.
Zero dependencies
No Docker. No Python. No cloud. One Rust binary with local embeddings. Runs on macOS (Intel + ARM) and Linux. All your data stays in ~/.sediment/.
Intelligent recall
Semantic search with memory decay, trust scoring, relationship graph, and auto-consolidation. Not just vector search.
MCP native
Works with Claude Code, Claude Desktop, Cursor, VS Code Copilot, Windsurf, JetBrains — any MCP client.
Comparison
The simplest path.
Other memory servers need Docker, Python, or cloud APIs. Sediment is a single binary.
Sediment
Single Rust binary, zero config
- Single binary install
- Zero dependencies
- 4 minimal MCP tools
- Local embeddings
- Relationship graph
- Memory decay & trust scoring
OpenMemory MCP
Mem0's local MCP server
- Docker + Postgres + Qdrant
- 3 services required
- 10+ MCP tools
- API-dependent embeddings
- No relationship graph
- No memory decay
mcp-memory-service
Python MCP memory server
- Python + pip install
- Python runtime + dependencies
- 12 MCP tools
- API-dependent embeddings
- No relationship graph
- No memory decay
Benchmarks
Tested, not guessed.
1,000 developer memories, 200 search queries. Benchmarked against 5 alternatives.
| Metric | Sediment | ChromaDB | Mem0 |
|---|---|---|---|
| Recall@1 | 50.0% | 47.0% | 47.0% |
| Recall@3 | 69.0% | 69.0% | 69.0% |
| Recall@5 | 77.5% | 78.5% | 78.5% |
| Recall@10 | 89.5% | 90.0% | 90.0% |
| MRR | 61.9% | 60.8% | 60.8% |
| Recency@1 | 100.0% | 14.0% | 14.0% |
| Consolidation rate | 99% | 0% | 0% |
| Store p50 | 50ms | 692ms | 14ms |
Apple M3 Max, 36GB RAM. 1,000 memories, 200 queries. Full methodology
API
4 tools. That's it.
A minimal API that LLMs can actually use well: just 8 parameters across all four tools.
store
Save content to memory. Pass content and, optionally, a scope. Use replace_id to atomically update an existing item.
recall
Semantic search. Pass a query and, optionally, a limit.
list
Browse stored items. Optional limit and scope.
forget
Delete an item by id.
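Under MCP, a client invokes these tools as JSON-RPC `tools/call` requests. A minimal Python sketch of the four payloads follows; the JSON-RPC envelope is standard MCP, but the argument values are illustrative, not taken from Sediment's documentation:

```python
# Illustrative tool-call payloads for Sediment's four MCP tools.
# The "tools/call" envelope follows the MCP convention; the argument
# values below are made up for the example.
calls = [
    {"name": "store",  "arguments": {"content": "Team prefers rebase over merge", "scope": "my-project"}},
    {"name": "recall", "arguments": {"query": "git workflow preferences", "limit": 3}},
    {"name": "list",   "arguments": {"limit": 10, "scope": "my-project"}},
    {"name": "forget", "arguments": {"id": "some-item-id"}},
]

requests = [
    {"jsonrpc": "2.0", "id": i, "method": "tools/call", "params": call}
    for i, call in enumerate(calls, start=1)
]
```

Because each tool takes so few arguments, the model rarely has to guess which parameter goes where.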
Internals
Under the hood.
Built on proven foundations with intelligent scoring and automatic organization.
Two-database hybrid
LanceDB for vectors, SQLite for the relationship graph and access tracking. All embedded, zero config.
Local embeddings
all-MiniLM-L6-v2 via Candle. 384-dim vectors, no API keys, no network calls.
Memory decay
30-day half-life freshness scoring combined with log-scaled access frequency. Old memories rank lower but are never deleted.
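The freshness term above can be sketched as exponential decay with a 30-day half-life. How Sediment blends freshness with access frequency isn't specified here, so the multiplicative combination below is an assumption:

```python
import math

HALF_LIFE_DAYS = 30.0  # from the docs: 30-day half-life

def freshness(age_days: float) -> float:
    # Exponential decay: the freshness score halves every HALF_LIFE_DAYS.
    return 0.5 ** (age_days / HALF_LIFE_DAYS)

def decay_score(age_days: float, access_count: int) -> float:
    # Hypothetical combination: the docs say freshness is "combined with
    # log-scaled access frequency" but do not give the exact formula.
    return freshness(age_days) * (1.0 + math.log1p(access_count))
```

Note that the score only approaches zero asymptotically, which matches the claim that old memories rank lower but are never deleted.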
Trust-weighted scoring
Validated and well-connected memories score higher. The more you use a memory, the more trustworthy it becomes.
Auto-consolidation
Near-duplicates auto-merged. Similar items linked. Runs in the background, non-blocking.
Project scoping
Automatic context isolation per project. Same-project items boosted, cross-project results flagged.
Type-aware chunking
Intelligent splitting for markdown, code, JSON, YAML, and plain text. Long content is chunked with individual embeddings.
Co-access patterns
Tracks which memories are accessed together. Frequently co-accessed items surface automatically in future recalls.
Cross-project recall
Results from other projects are surfaced and flagged with provenance metadata. Knowledge flows across your work.
Hybrid search
Vector similarity combined with BM25 full-text scoring. Best of both worlds for retrieval quality.
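A common way to fuse the two signals is a weighted linear blend after normalizing the BM25 score; Sediment's actual fusion isn't documented here, so treat this as an assumed form with made-up weights:

```python
def hybrid_score(cosine_sim, bm25, alpha=0.6, bm25_cap=10.0):
    # Assumed linear fusion: alpha weights vector similarity against a
    # BM25 score squashed into [0, 1]. The real weighting may differ.
    return alpha * cosine_sim + (1 - alpha) * min(bm25 / bm25_cap, 1.0)

# A memory that matches semantically but shares few exact terms can still
# outrank a pure keyword hit:
semantic_hit = hybrid_score(cosine_sim=0.9, bm25=1.0)
keyword_hit = hybrid_score(cosine_sim=0.3, bm25=9.0)
```

The practical payoff: exact identifiers (function names, error codes) are caught by BM25, while paraphrased queries are caught by the vectors.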
Quick start
Two steps.
Install the binary and add it to your MCP client config. That's it.
Add to your MCP client
{
"mcpServers": {
"sediment": {
"command": "sediment"
}
}
}
Works with Claude Code, Claude Desktop, Cursor, VS Code, Windsurf, and JetBrains.
CLI
Terminal included.
Manage your memory from the command line.