What is Graph Memory?
Graph Memory is an MCP (Model Context Protocol) server that turns any project directory into a queryable semantic knowledge base. It indexes your markdown docs, source code, and other project files into interconnected graphs, then exposes them through 70 MCP tools and a full-featured web UI.

Who is it for?
- Developers who want their AI assistant (Claude, Cursor, Windsurf) to deeply understand their codebase
- Teams that need a persistent knowledge base that AI assistants can read and write to
- Anyone tired of AI losing context between conversations

What it does
| Feature | Description |
|---|---|
| Docs indexing | Parses markdown into heading-based chunks with cross-file links and code block extraction |
| Code indexing | Extracts functions, classes, interfaces via tree-sitter AST parsing (TypeScript/JavaScript) |
| File index | Indexes all project files with metadata, language detection, directory hierarchy |
| Knowledge graph | Persistent notes and facts with typed relations and cross-graph links |
| Task management | Kanban workflow with priorities, assignees, due dates, and cross-graph context |
| Skills | Reusable recipes with steps, triggers, and usage tracking |
| Hybrid search | BM25 keyword + vector cosine similarity with graph expansion |
| Real-time | File watching + WebSocket push to UI |
| Multi-project | One process manages multiple projects from a single config |
| Web UI | Dashboard, kanban board, code browser, search, prompt builder |
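To make the hybrid search row concrete: the idea is to blend a BM25 keyword score with vector cosine similarity. The sketch below is illustrative only, not Graph Memory's actual implementation; the function names, the max-normalization of BM25 scores, and the 50/50 fusion weight `alpha` are all assumptions.

```typescript
// Cosine similarity between two dense vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

interface ScoredNode {
  id: string;
  bm25: number;        // keyword score from a BM25 index (hypothetical field)
  embedding: number[]; // node's locally computed embedding
}

// Normalize BM25 scores to [0, 1], blend with cosine similarity,
// and return nodes sorted best-first.
function hybridRank(query: number[], nodes: ScoredNode[], alpha = 0.5): ScoredNode[] {
  const maxBm25 = Math.max(...nodes.map(n => n.bm25), 1e-9);
  const score = (n: ScoredNode) =>
    alpha * (n.bm25 / maxBm25) + (1 - alpha) * cosine(query, n.embedding);
  return [...nodes].sort((x, y) => score(y) - score(x));
}
```

Graph expansion (following typed relations out from the top hits) would then run on top of a ranking like this.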

How it works

```
Your Project   →   Graph Memory   →   AI Assistant
      │                  │                  │
    files,            graphs,         70 MCP tools
    docs,             embeddings,     for search,
    code              web UI          CRUD, linking
```
- Point Graph Memory at your project directory
- It indexes docs, code, and files into interconnected graphs
- It embeds every node locally using an embedding model (~560 MB, no API calls)
- AI assistants query the graphs through 70 MCP tools
- You manage knowledge, tasks, and skills through MCP tools or the web UI
- File mirror syncs notes/tasks/skills to `.notes/`, `.tasks/`, and `.skills/` folders for IDE editing
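The docs-indexing step above splits markdown into heading-based chunks. A rough sketch of that idea (illustrative only; Graph Memory's real parser also extracts cross-file links and code blocks, which this omits):

```typescript
interface Chunk {
  heading: string; // heading text, or "(preamble)" for content before any heading
  level: number;   // number of '#' characters (0 for the preamble)
  body: string[];  // lines under this heading
}

// Split a markdown document into one chunk per heading.
function chunkByHeadings(markdown: string): Chunk[] {
  const chunks: Chunk[] = [];
  let current: Chunk = { heading: "(preamble)", level: 0, body: [] };
  for (const line of markdown.split("\n")) {
    const m = /^(#{1,6})\s+(.*)$/.exec(line);
    if (m) {
      // Keep the preamble chunk only if it actually has content.
      if (current.body.length > 0 || current.heading !== "(preamble)") {
        chunks.push(current);
      }
      current = { heading: m[2], level: m[1].length, body: [] };
    } else {
      current.body.push(line);
    }
  }
  chunks.push(current);
  return chunks;
}
```

Each chunk then becomes a graph node that gets its own local embedding.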

Key principles
- Everything is local — embeddings run on your machine, no data leaves your network
- Zero config to start — `npm install -g @graphmemory/server && graphmemory serve`
- Graphs, not chunks — structured graphs with relationships, not flat vector stores
- AI-native — designed for MCP clients, not just humans
- Multi-project — one server, many projects, shared workspaces
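Because it speaks MCP, registering the server with an MCP client follows the client's standard config shape. For example, a Claude Desktop `mcpServers` entry might look like the following — the `"graph-memory"` key is an arbitrary label, and the exact command and args should be verified against the project's own docs (the `graphmemory serve` command comes from the quick-start above):

```json
{
  "mcpServers": {
    "graph-memory": {
      "command": "graphmemory",
      "args": ["serve"]
    }
  }
}
```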