MCP Configs · Apr 9, 2026 · 3 min read

Codebase Memory MCP — Code Intelligence for AI Agents

High-performance code intelligence MCP server. Indexes repos in milliseconds via tree-sitter AST, supports 66 languages, sub-ms graph queries. MIT, 1,300+ stars.

TL;DR
An MCP server that indexes codebases via tree-sitter AST, supporting 66 languages with sub-millisecond graph queries.
§01

What it is

Codebase Memory MCP is a high-performance code intelligence server built on the Model Context Protocol. It indexes entire repositories in milliseconds using tree-sitter AST parsing and supports 66 programming languages. AI agents query the indexed graph with sub-millisecond response times.

It targets developers using AI coding assistants like Claude Code, Cursor, or Windsurf who need their agent to understand the full structure of a codebase — functions, classes, imports, and call graphs — without manually feeding context.
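
For intuition about what a tree-sitter index captures, the sketch below uses the node-tree-sitter bindings directly to pull function and class declarations out of a JavaScript snippet. It is only an illustration of the parsing approach; Codebase Memory MCP's own indexer, graph schema, and tool names are internal to the project, and the package names here ("tree-sitter", "tree-sitter-javascript") are assumptions for a standalone experiment, not its actual dependencies.

// Illustrative sketch only: generic tree-sitter AST extraction,
// not Codebase Memory MCP's internal indexer.
import Parser from "tree-sitter";
import JavaScript from "tree-sitter-javascript";

const parser = new Parser();
parser.setLanguage(JavaScript);

const source = `
function add(a, b) { return a + b; }
class Calculator { add(a, b) { return add(a, b); } }
`;

// Parse the snippet into a syntax tree.
const tree = parser.parse(source);

// Walk the tree and collect the declarations a code-intelligence
// graph would typically index as symbols.
function collect(node: Parser.SyntaxNode, out: string[]): void {
  if (node.type === "function_declaration" || node.type === "class_declaration") {
    const name = node.childForFieldName("name");
    out.push(`${node.type}: ${name ? name.text : "<anonymous>"}`);
  }
  for (const child of node.namedChildren) {
    collect(child, out);
  }
}

const symbols: string[] = [];
collect(tree.rootNode, symbols);
console.log(symbols); // [ 'function_declaration: add', 'class_declaration: Calculator' ]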

§02

How it saves time or tokens

Without a code intelligence server, AI agents must read files sequentially to understand project structure, consuming tokens on boilerplate and irrelevant code. Codebase Memory MCP pre-indexes the AST so the agent queries only what it needs. This reduces context window usage and speeds up responses.

§03

How to use

  1. Add the MCP server to your .mcp.json configuration:
{
  "mcpServers": {
    "codebase-memory-mcp": {
      "command": "npx",
      "args": ["-y", "codebase-memory-mcp@latest"]
    }
  }
}
  2. Restart your AI editor (Claude Code, Cursor, or Windsurf).
  3. The server indexes your project automatically and responds to structural queries from the agent.
§04

Example

{
  "mcpServers": {
    "codebase-memory-mcp": {
      "command": "npx",
      "args": ["-y", "codebase-memory-mcp@latest"]
    }
  }
}

Once configured, your AI agent can query the codebase graph for function definitions, class hierarchies, and import chains without reading every file.

§05

Key considerations

When evaluating Codebase Memory MCP for your workflow, consider the following factors:

  • Whether your team has the technical prerequisites to adopt the tool effectively (an MCP-capable editor and Node.js for the npx-based install).
  • The maintenance burden of running another local server, weighed against the productivity gains.
  • Community activity and documentation quality, as signals of long-term viability.
  • Integration with your existing toolchain, which matters more than feature count alone.

Start with a small pilot project before rolling it out across the organization, monitor resource usage during the initial adoption phase to identify bottlenecks early, and document your configuration decisions so team members can onboard independently.

§06

Common pitfalls

  • Very large monorepos (100K+ files) may take longer for the initial index; subsequent queries remain fast.
  • Languages not in the 66 supported by tree-sitter will be skipped during indexing.
  • The MCP server runs as a local process; it does not send code to external servers, but it does consume local CPU and memory during indexing.

Frequently Asked Questions

Which AI editors support Codebase Memory MCP?

Any editor or tool that supports the Model Context Protocol can use it. This includes Claude Code, Cursor, Windsurf, and other MCP-compatible clients. Configuration is done via a JSON file.

Does it send my code to external servers?

No. The MCP server runs entirely locally. It indexes your codebase on your machine and serves queries locally. No code leaves your environment.

How many languages does it support?

It supports 66 programming languages through tree-sitter parsers. This covers most mainstream languages including Python, JavaScript, TypeScript, Go, Rust, Java, C, C++, and many more.

How fast is the initial indexing?

The README states indexing happens in milliseconds for typical repositories. Very large monorepos may take longer, but the indexed graph is cached for subsequent queries.

Can I use it without an AI editor?

The server exposes an MCP interface. While designed for AI editors, any MCP client can connect to it. You could build custom tooling that queries the code graph directly.
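
For example, a minimal sketch of such custom tooling using the official TypeScript MCP SDK (@modelcontextprotocol/sdk) might look like the following. It only spawns the server over stdio and lists whatever tools it advertises; the actual tool names and query schemas come from the server at runtime, so inspect that output rather than assuming them from this sketch.

// Minimal MCP client sketch (ESM, top-level await). Assumes
// "@modelcontextprotocol/sdk" is installed; tool names are discovered
// at runtime, not hard-coded.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server the same way the editor config does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "codebase-memory-mcp@latest"],
});

const client = new Client({ name: "codebase-graph-probe", version: "0.1.0" });
await client.connect(transport);

// Ask the server which tools it exposes for querying the code graph.
const { tools } = await client.listTools();
for (const tool of tools) {
  console.log(`${tool.name}: ${tool.description ?? ""}`);
}

await client.close();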

Source & Thanks

Created by DeusData. Licensed under MIT.

codebase-memory-mcp — ⭐ 1,300+

Thanks to the DeusData team for building a fast, MIT-licensed code intelligence engine for MCP.
