deja
Persistent memory for AI agents

Agents lose context between runs. deja gives them a durable place to store and retrieve what matters.

deja is a memory layer: agents call learn() to store what they know, and inject() or recall() to get it back.

Three packages — local, edge, hosted — same idea, pick your runtime.

Three memory systems

Same concepts, different runtimes. Pick the one that matches your stack.

deja-local: in-process vector memory for Bun

Search: vector similarity (all-MiniLM-L6-v2 via ONNX)
Storage: SQLite file on disk
Latency: zero (runs in your process)
Deps: Bun runtime + HuggingFace transformers

Best for: CLI tools, local agents, scripts, development.

```typescript
import DejaLocal from 'deja-local'

const mem = new DejaLocal('./memories.db')
await mem.remember("node 20 breaks esbuild")
const hits = await mem.recall("deploying")
```
|             | deja-local        | deja-edge         | deja (hosted)      |
|-------------|-------------------|-------------------|--------------------|
| Runtime     | Bun               | Cloudflare DO     | Cloudflare Workers |
| Search      | Cosine similarity | FTS5 / BM25       | Vectorize semantic |
| Embeddings  | all-MiniLM-L6-v2  | None (text-based) | bge-small-en-v1.5  |
| Latency     | Zero (in-process) | Zero (in-DO)      | Network hop        |
| Multi-agent | Single process    | Per-DO isolation  | Scoped sharing     |
| MCP         | Built-in          |                   |                    |
| Install     | npm i deja-local  | npm i deja-edge   | npm i deja-client  |
How it works
1. Agent learns

After a run, the agent stores what it discovered. deja embeds the text, checks for duplicates, and assigns a confidence score.
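
Stripped to its essentials, the dedup check in this step can be sketched as follows. This is a toy, self-contained illustration, not deja's implementation: real deja produces embeddings with a model (all-MiniLM-L6-v2 locally), while here they are hand-made vectors, and the 0.95 duplicate threshold and 0.5 starting confidence are assumptions for the example.

```typescript
// Toy sketch of learn-time dedup: embed, compare against the store,
// skip near-duplicates, and give new memories a starting confidence.

type Memory = { text: string; embedding: number[]; confidence: number }

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    na += a[i] * a[i]
    nb += b[i] * b[i]
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb))
}

function learn(store: Memory[], text: string, embedding: number[]): boolean {
  // skip anything nearly identical to an existing memory
  if (store.some(m => cosine(m.embedding, embedding) > 0.95)) return false
  store.push({ text, embedding, confidence: 0.5 }) // illustrative starting score
  return true
}

const store: Memory[] = []
learn(store, "node 20 breaks esbuild", [1, 0, 0])
learn(store, "node 20 breaks esbuild again", [0.99, 0.01, 0]) // near-duplicate, skipped
// store now holds a single memory
```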

2. Next agent recalls

Before acting, the next agent describes what it's about to do. deja returns the most relevant memories — already formatted for injection into the prompt.
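
The recall-and-inject flow can be sketched the same way: rank stored memories against the query's embedding and format the best matches as a prompt preamble. Again a toy with hand-made vectors; the k = 3 default and the preamble wording are assumptions, not deja's actual injection format.

```typescript
// Toy sketch of recall-and-inject: score each memory against the
// query embedding, take the top k, and format them for the prompt.

type Memory = { text: string; embedding: number[] }

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    na += a[i] * a[i]
    nb += b[i] * b[i]
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb))
}

function inject(store: Memory[], query: number[], k = 3): string {
  const top = store
    .map(m => ({ m, score: cosine(m.embedding, query) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
  return ["Relevant memories from previous runs:",
          ...top.map(({ m }) => `- ${m.text}`)].join("\n")
}

const store: Memory[] = [
  { text: "node 20 breaks esbuild", embedding: [1, 0, 0] },
  { text: "staging deploys need VPN", embedding: [0, 1, 0] },
]
const preamble = inject(store, [0.9, 0.1, 0], 1)
// preamble lists only the esbuild memory, the closer match
```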

3. Memory evolves

Useful memories gain confidence. Contradicted ones get superseded. Stale ones decay. Cleanup and feedback help keep the store usable over time.
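
One simple way to realize this lifecycle: feedback nudges a memory's confidence up or down, and a periodic cleanup pass decays every score and prunes what falls below a floor. The 0.2 reinforcement step, halving penalty, 0.9 decay factor, and 0.1 floor below are illustrative assumptions, not deja's actual parameters.

```typescript
// Toy sketch of memory evolution: reinforce, penalize, decay, prune.

type Memory = { text: string; confidence: number }

function feedback(m: Memory, helpful: boolean): void {
  m.confidence = helpful
    ? m.confidence + (1 - m.confidence) * 0.2 // useful: move toward 1
    : m.confidence * 0.5                      // contradicted: halve
}

function decayAndPrune(store: Memory[]): Memory[] {
  for (const m of store) m.confidence *= 0.9  // everything goes stale slowly
  return store.filter(m => m.confidence >= 0.1) // drop what decayed away
}

const store: Memory[] = [
  { text: "node 20 breaks esbuild", confidence: 0.5 },
  { text: "use yarn", confidence: 0.05 },
]
feedback(store[0], true)          // 0.5 -> 0.6
const kept = decayAndPrune(store) // "use yarn" decays below the floor and is pruned
```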

Stop re-explaining. Start remembering.

Pick a package, add two function calls, and your agents will never forget again.
