MCP Configs · Apr 6, 2026 · 2 min read

Codebase Memory MCP — Code Knowledge Graph Server

High-performance MCP server that indexes codebases into persistent knowledge graphs. Supports 66 languages, sub-millisecond queries, and claims 99% fewer tokens than raw file context. 1,100+ stars.

MCP Hub · Community
Quick Use

Use it first, then decide how deep to go

Copy the config below, install the server, and try it before reading further.

Add to your .mcp.json:

{
  "mcpServers": {
    "codebase-memory": {
      "command": "npx",
      "args": ["-y", "codebase-memory-mcp"]
    }
  }
}

Restart Claude Code. The server automatically indexes your project on first use. Ask: "What does the authentication module do?" or "Find all API endpoints."


Intro

Codebase Memory MCP is a high-performance Model Context Protocol server that indexes entire codebases into a persistent knowledge graph. It supports 66 programming languages, delivers sub-millisecond queries, and claims to use 99% fewer tokens than dumping raw files into context. The server ships as a single static binary with zero dependencies and has earned 1,100+ GitHub stars. It is best suited to developers working on large codebases who need their AI agent to understand project architecture without burning the context window. Works with: Claude Code, Cursor, Windsurf, and any MCP client. Setup time: under 1 minute.


How It Works

Automatic Indexing

On first connection, Codebase Memory scans your project and builds a knowledge graph of:

  • File structure and dependencies
  • Function/class definitions and relationships
  • Import chains and module boundaries
  • Code patterns and conventions
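
The project does not publish its internal schema, but a code knowledge graph of this kind can be pictured as typed nodes (files, functions, classes) connected by typed edges (defines, imports, calls). The sketch below is a hypothetical illustration in Python; all names (`Node`, `Graph`, `auth.login`, etc.) are invented for this example and are not the server's actual data structures.

```python
# Hypothetical sketch of the kind of graph a codebase indexer might build.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str  # e.g. "file", "function", "class"
    name: str

@dataclass
class Graph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)  # (source, relation, target)

    def add(self, kind: str, name: str) -> None:
        self.nodes[name] = Node(kind, name)

    def link(self, src: str, relation: str, dst: str) -> None:
        self.edges.append((src, relation, dst))

g = Graph()
g.add("file", "auth.py")
g.add("function", "auth.login")
g.add("function", "db.get_user")
g.link("auth.py", "defines", "auth.login")
g.link("auth.login", "calls", "db.get_user")
```

Once relationships like "defines" and "calls" are stored as edges, architectural questions become cheap graph traversals instead of full-text searches over source files.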

Smart Context Retrieval

Instead of dumping entire files into the LLM context, Codebase Memory returns precisely the relevant code snippets:

User: "How does the payment processing work?"
Agent: [queries knowledge graph] → returns only payment-related functions,
       their dependencies, and data flow — not entire files
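
One plausible way to implement that retrieval step, sketched here purely for illustration (the edge data and the `relevant` helper are invented, not the server's code): match graph nodes against the query topic, then pull in their direct dependencies, and return only that subgraph.

```python
# Hypothetical retrieval sketch: return only the nodes related to a topic,
# plus their direct dependencies, instead of whole files.
edges = [
    ("billing.charge", "calls", "payments.process"),
    ("payments.process", "calls", "db.save_txn"),
    ("auth.login", "calls", "db.get_user"),
]

def relevant(topic: str, edges: list) -> list:
    # nodes whose name mentions the topic, on either end of an edge
    hits = {s for s, _, d in edges if topic in s} | \
           {d for s, _, d in edges if topic in d}
    # plus everything those nodes directly depend on
    deps = {d for s, _, d in edges if s in hits}
    return sorted(hits | deps)

relevant("payment", edges)  # → ['db.save_txn', 'payments.process']
```

Note that the unrelated `auth.login` → `db.get_user` edge never reaches the model's context, which is where the token savings come from.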

66-Language Support

Built on Tree-sitter parsers for accurate AST analysis across languages: TypeScript, Python, Go, Rust, Java, C++, Ruby, PHP, Swift, Kotlin, and 56 more.
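
Tree-sitter grammars require per-language setup, but the underlying idea, parse source into an AST and extract definitions and call sites, can be shown with Python's standard-library `ast` module. This is an analogy for one language only, not how the server itself parses code.

```python
# Analogy using Python's stdlib ast module: extract function definitions
# and call sites from parsed source, the same job Tree-sitter does
# across many languages.
import ast

source = """
def login(user):
    return get_user(user)

def get_user(name):
    ...
"""

tree = ast.parse(source)
defs = [n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
call_sites = [n.func.id for n in ast.walk(tree)
              if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)]

print(defs)        # ['login', 'get_user']
print(call_sites)  # ['get_user']
```

An indexer repeats this extraction per file and records the results as graph nodes ("login is defined here") and edges ("login calls get_user").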

Persistent Graph

The knowledge graph persists between sessions. Re-indexing only processes changed files:

First index: ~30 seconds for a 100K LOC project
Subsequent: <1 second (incremental)
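
Incremental re-indexing is typically driven by change detection: hash each file's contents and re-process only files whose hash differs from the stored index. The sketch below is a minimal, hypothetical version of that idea; the server's actual mechanism is not documented here.

```python
# Hypothetical change detection: re-index only files whose content
# hash differs from the last recorded one.
import hashlib
import tempfile
from pathlib import Path

def digest(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def changed_files(paths: list, index: dict) -> list:
    """Return files whose hash differs from the index, updating it."""
    changed = []
    for p in paths:
        h = digest(p)
        if index.get(p) != h:
            changed.append(p)
            index[p] = h
    return changed

# Demo in a throwaway directory.
root = Path(tempfile.mkdtemp())
a, b = root / "a.py", root / "b.py"
a.write_text("x = 1\n")
b.write_text("y = 2\n")

index = {}
first = changed_files([a, b], index)   # first pass: both files are new
a.write_text("x = 42\n")
second = changed_files([a, b], index)  # second pass: only a.py changed
```

This is why the first index of a 100K LOC project takes seconds while subsequent runs finish almost instantly: unchanged files are skipped entirely.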

Query Capabilities

"What functions call the createUser method?"
"Show the data flow from API request to database"
"Find all error handling patterns in the codebase"
"What are the dependencies of the auth module?"
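
A query like "what functions call createUser?" is a reverse traversal over the call edges. Here is a minimal sketch under assumed data; the `calls` map and `callers` function are illustrative inventions, not the server's query API.

```python
# Hypothetical caller query: walk call edges backwards to find
# direct and transitive callers of a symbol.
calls = {
    "api.signup": ["users.createUser"],
    "admin.importUsers": ["users.createUser"],
    "users.createUser": ["db.insert"],
}

def callers(target: str, calls: dict) -> list:
    direct = [f for f, callees in calls.items() if target in callees]
    seen = set(direct)
    stack = list(direct)
    while stack:  # follow caller edges transitively
        f = stack.pop()
        for parent in (g for g, cs in calls.items() if f in cs):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return sorted(seen)

callers("users.createUser", calls)  # → ['admin.importUsers', 'api.signup']
```

The same traversal run on `db.insert` also surfaces `users.createUser` and its callers, which is how "show the data flow" style questions can be answered from the graph alone.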

Key Stats

  • 1,100+ GitHub stars
  • 66 programming languages supported
  • Sub-millisecond query response
  • 99% token reduction vs raw file context
  • Single static binary, zero dependencies
  • ~210K total downloads

FAQ

Q: What is Codebase Memory MCP? A: It is an MCP server that indexes your codebase into a persistent knowledge graph, letting AI agents understand project architecture and retrieve precise code context without reading entire files.

Q: Is Codebase Memory MCP free? A: Yes, open-source under Apache 2.0 license. All indexing runs locally.

Q: How much more efficient is it than raw file context? A: The project claims a 99% token reduction, since queries return only the relevant code snippets instead of entire files, and responses arrive in sub-millisecond time.



Source & Thanks

Created by DeusData. Licensed under Apache 2.0.

codebase-memory-mcp — ⭐ 1,100+

Thanks to DeusData for making large codebase navigation effortless for AI agents.
