MCP Configs · Apr 6, 2026 · 2 min read

Codebase Memory MCP — Code Knowledge Graph Server

High-performance MCP server that indexes codebases into persistent knowledge graphs. Supports 66 languages, sub-millisecond queries, and claims 99% fewer tokens than raw file context. 1,100+ stars.

TL;DR
MCP server that builds a knowledge graph from your codebase for fast, token-efficient AI queries.
§01

What it is

Codebase Memory MCP is a high-performance MCP server that indexes your codebase into a persistent knowledge graph. It supports 66 programming languages, provides sub-millisecond queries, and reduces the tokens needed for code context by replacing raw file content with structured graph lookups. The server runs locally and integrates with AI assistants through the Model Context Protocol.

The project targets developers using AI coding assistants who want better code understanding without sending entire files as context. By pre-indexing the codebase into a graph of functions, classes, imports, and dependencies, the MCP server provides precise answers to structural queries.
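To make the idea concrete, here is a minimal sketch of the kind of graph such an indexer might build. The node and edge shapes below are illustrative assumptions for the sketch, not the server's actual schema:

```typescript
// Illustrative only: a tiny in-memory code knowledge graph.
// The real server's schema and storage format are not documented here.
type GraphNode = { id: string; kind: "function" | "class" | "file" };
type GraphEdge = { from: string; to: string; rel: "calls" | "imports" };

const nodes: GraphNode[] = [
  { id: "checkout.ts:handleCheckout", kind: "function" },
  { id: "payment.ts:processPayment", kind: "function" },
  { id: "checkout.ts", kind: "file" },
];

const edges: GraphEdge[] = [
  { from: "checkout.ts", to: "payment.ts", rel: "imports" },
  { from: "checkout.ts:handleCheckout", to: "payment.ts:processPayment", rel: "calls" },
];

// A structural query is an edge traversal, not a file read.
function callersOf(target: string): string[] {
  return edges
    .filter((e) => e.rel === "calls" && e.to === target)
    .map((e) => e.from);
}

const callers = callersOf("payment.ts:processPayment");
```

Once the graph exists, "what calls this function?" is a cheap lookup over edges rather than a scan of source files.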

§02

How it saves time or tokens

Sending raw files to an AI assistant wastes tokens on irrelevant code. Codebase Memory MCP indexes your codebase once and serves targeted queries: 'what calls this function,' 'what does this class depend on,' 'show me the type hierarchy.' Each query returns only the relevant graph nodes, potentially using 99% fewer tokens than raw file context. The persistent index survives restarts, so re-indexing only processes changed files.
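A back-of-the-envelope illustration of the savings, using made-up file sizes and a crude 4-characters-per-token estimate (both are assumptions for the sketch, not measurements of the real server):

```typescript
// Toy comparison: shipping three whole files as context vs. shipping
// only the graph answer to "what calls processPayment()?".
const rawFiles: Record<string, string> = {
  "checkout.ts": "x".repeat(8000),     // pretend file contents
  "subscription.ts": "x".repeat(12000),
  "admin.ts": "x".repeat(6000),
};

const graphAnswer = [
  "checkout.ts:handleCheckout (line 45)",
  "subscription.ts:renewSubscription (line 112)",
  "admin.ts:manualCharge (line 78)",
];

// Crude token estimate: roughly 4 characters per token.
const estTokens = (s: string) => Math.ceil(s.length / 4);

const rawCost = Object.values(rawFiles).reduce((n, f) => n + estTokens(f), 0);
const graphCost = estTokens(graphAnswer.join("\n"));
const savings = 1 - graphCost / rawCost;
```

In this toy case the graph answer costs a few dozen tokens against several thousand for the raw files, which is where claims like "99% fewer tokens" come from.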

§03

How to use

  1. Add the server to your MCP configuration (e.g. .mcp.json for Claude Code):
{
  "mcpServers": {
    "codebase-memory": {
      "command": "npx",
      "args": ["-y", "codebase-memory-mcp"]
    }
  }
}
  2. Restart Claude Code. The server indexes your codebase automatically on first use.
  3. Ask Claude structural questions about your codebase; it will use the knowledge graph for precise answers.
§04

Example

Queries the knowledge graph can answer efficiently:

# Instead of reading entire files, the MCP server returns graph data:

> What functions call processPayment()?
  - checkout.ts:handleCheckout (line 45)
  - subscription.ts:renewSubscription (line 112)
  - admin.ts:manualCharge (line 78)

> What are the dependencies of the AuthService class?
  - UserRepository (import from ./repositories)
  - TokenManager (import from ./utils)
  - Logger (import from @/shared)

> Show the type hierarchy for BaseController
  - BaseController
    - UserController
    - ProductController
    - OrderController
§05

Common pitfalls

  • Initial indexing of large codebases (100K+ files) can take several minutes; subsequent incremental updates are faster
  • The knowledge graph is local to your machine; team members each maintain their own index
  • Language support varies in depth; popular languages (TypeScript, Python, Go) have richer graph structures than niche languages

Frequently Asked Questions

How many programming languages does it support?

Codebase Memory MCP supports 66 programming languages through tree-sitter parsers. Coverage depth varies: mainstream languages like TypeScript, Python, Go, and Java have full function/class/import extraction, while less common languages may have basic file-level indexing.

Does it work with Claude Code?

Yes. Add the MCP server to your .mcp.json configuration and restart Claude Code. The server integrates through the standard Model Context Protocol, providing code structure tools that Claude can call during conversations.

How much disk space does the index use?

The knowledge graph is typically 5-15% of the source code size. A 100MB codebase produces roughly a 10MB index. The index is stored locally and persists across sessions.

Does it index binary files or node_modules?

By default, the server skips binary files, node_modules, vendor directories, and other common non-source paths. You can customize the ignore patterns in the configuration.

How does incremental indexing work?

After the initial full index, the server watches for file changes and only re-indexes modified files. This keeps the knowledge graph current without re-processing the entire codebase.
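One common way to implement change detection is content hashing, sketched below. This is an assumption about the approach; the server may instead rely on file watchers or modification times:

```typescript
import { createHash } from "node:crypto";

// Illustrative sketch: hash each file's contents and re-index only the
// files whose hash differs from the last pass.
const indexed = new Map<string, string>(); // path -> content hash

function filesToReindex(files: Map<string, string>): string[] {
  const changed: string[] = [];
  for (const [path, content] of files) {
    const hash = createHash("sha256").update(content).digest("hex");
    if (indexed.get(path) !== hash) {
      changed.push(path);
      indexed.set(path, hash); // record the new state for the next pass
    }
  }
  return changed;
}

// First pass: everything is new. Second pass: only b.ts was edited.
const firstPass = filesToReindex(new Map([
  ["a.ts", "export const x = 1;"],
  ["b.ts", "export const y = 2;"],
]));
const secondPass = filesToReindex(new Map([
  ["a.ts", "export const x = 1;"],
  ["b.ts", "export const y = 3;"],
]));
```

Because only changed hashes trigger work, a large codebase with a handful of edits re-indexes in a fraction of the initial time.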


Source & Thanks

Created by DeusData. Licensed under Apache 2.0.

codebase-memory-mcp — ⭐ 1,100+

Thanks to DeusData for making large codebase navigation effortless for AI agents.
