Codebase Memory MCP — Code Intelligence for AI Agents
High-performance code intelligence MCP server. Indexes repos in milliseconds via tree-sitter AST, supports 66 languages, sub-ms graph queries. MIT, 1,300+ stars.
What it is
Codebase Memory MCP is a high-performance code intelligence server built on the Model Context Protocol. It indexes entire repositories in milliseconds using tree-sitter AST parsing and supports 66 programming languages. AI agents query the indexed graph with sub-millisecond response times.
It targets developers using AI coding assistants like Claude Code, Cursor, or Windsurf who need their agent to understand the full structure of a codebase — functions, classes, imports, and call graphs — without manually feeding context.
How it saves time or tokens
Without a code intelligence server, AI agents must read files sequentially to understand project structure, consuming tokens on boilerplate and irrelevant code. Codebase Memory MCP pre-indexes the AST so the agent queries only what it needs. This reduces context window usage and speeds up responses.
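The pre-indexing idea can be sketched with Python's built-in `ast` module. This is a conceptual illustration only, not the server's implementation (which uses tree-sitter and its own graph store): parse each file once, record where every function is defined, and answer later lookups from the index instead of re-reading files.

```python
import ast

def build_index(sources: dict[str, str]) -> dict[str, tuple[str, int]]:
    """Parse each file once and map function name -> (file, line)."""
    index = {}
    for path, code in sources.items():
        for node in ast.walk(ast.parse(code)):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                index[node.name] = (path, node.lineno)
    return index

# Toy two-file "repository" (illustrative data).
sources = {
    "app.py": "def main():\n    helper()\n",
    "util.py": "def helper():\n    return 42\n",
}
index = build_index(sources)
print(index["helper"])  # ('util.py', 1)
```

The one-time parse is the expensive step; every subsequent "where is `helper` defined?" question is a dictionary lookup, which is the same trade the server makes at repository scale.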
How to use
- Add the MCP server to your `.mcp.json` configuration:
```json
{
  "mcpServers": {
    "codebase-memory-mcp": {
      "command": "npx",
      "args": ["-y", "codebase-memory-mcp@latest"]
    }
  }
}
```
- Restart your AI editor (Claude Code, Cursor, or Windsurf).
- The server indexes your project automatically and responds to structural queries from the agent.
Example
```json
{
  "mcpServers": {
    "codebase-memory-mcp": {
      "command": "npx",
      "args": ["-y", "codebase-memory-mcp@latest"]
    }
  }
}
```
Once configured, your AI agent can query the codebase graph for function definitions, class hierarchies, and import chains without reading every file.
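A structural query like "what does this module transitively import" is just a traversal over the indexed graph. The sketch below models that with a plain dictionary and a breadth-first walk; the module names and graph shape are illustrative, not output from the server.

```python
from collections import deque

# Toy import graph: module -> modules it imports (illustrative data).
imports = {
    "app": ["services", "models"],
    "services": ["models", "db"],
    "models": ["db"],
    "db": [],
}

def import_chain(graph: dict[str, list[str]], root: str) -> list[str]:
    """Breadth-first walk: everything `root` depends on, directly or transitively."""
    seen, order, queue = {root}, [], deque([root])
    while queue:
        module = queue.popleft()
        for dep in graph[module]:
            if dep not in seen:
                seen.add(dep)
                order.append(dep)
                queue.append(dep)
    return order

print(import_chain(imports, "app"))  # ['services', 'models', 'db']
```

An agent asking this question through the MCP server receives the answer directly, instead of spending tokens reading all four files to reconstruct the graph itself.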
Related on TokRepo
- MCP Integrations — More MCP servers for AI-assisted development
- AI Memory Providers — Memory solutions that complement code intelligence
Key considerations
When evaluating Codebase Memory MCP for your workflow, consider the following factors:
- Whether your team has the technical prerequisites to adopt the tool effectively.
- The maintenance burden weighed against the productivity gains.
- Community activity and documentation quality, as signals of long-term viability.
- Integration with your existing toolchain, which matters more than feature count alone.
Start with a small pilot project before rolling out across the organization, monitor resource usage during the initial adoption phase to identify bottlenecks early, and document your configuration decisions so team members can onboard independently.
Common pitfalls
- Very large monorepos (100K+ files) may take longer for the initial index; subsequent queries remain fast.
- Languages not in the 66 supported by tree-sitter will be skipped during indexing.
- The MCP server runs as a local process; it does not send code to external servers, but it does consume local CPU and memory during indexing.
Frequently Asked Questions
Which editors and tools can use it?
Any editor or tool that supports the Model Context Protocol can use it. This includes Claude Code, Cursor, Windsurf, and other MCP-compatible clients. Configuration is done via a JSON file.
Does my code leave my machine?
No. The MCP server runs entirely locally. It indexes your codebase on your machine and serves queries locally. No code leaves your environment.
Which languages are supported?
It supports 66 programming languages through tree-sitter parsers. This covers most mainstream languages, including Python, JavaScript, TypeScript, Go, Rust, Java, C, C++, and many more.
How fast is indexing?
The README states indexing happens in milliseconds for typical repositories. Very large monorepos may take longer, but the indexed graph is cached for subsequent queries.
Can I use it outside an AI editor?
The server exposes an MCP interface. While designed for AI editors, any MCP client can connect to it. You could build custom tooling that queries the code graph directly.
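Custom tooling would talk to the server the same way an editor does: MCP messages are JSON-RPC 2.0 exchanged over the server process's stdin/stdout. The sketch below only builds the first message a client sends; the `initialize` shape follows the MCP specification, and the client name and version strings are illustrative.

```python
import json

def jsonrpc(method: str, params: dict, req_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 request, the wire format MCP uses over stdio."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method, "params": params})

# First message every MCP client sends (shape per the MCP spec; values illustrative).
init = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "custom-graph-tool", "version": "0.1.0"},
})
print(init)
```

In practice a custom tool would spawn the server (the same `npx -y codebase-memory-mcp@latest` command from the config), write messages like this to its stdin, and read responses from its stdout; the official MCP client SDKs wrap this plumbing for you.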
Citations (3)
- Codebase Memory MCP GitHub — Indexes repos in milliseconds via tree-sitter AST, supports 66 languages
- Codebase Memory MCP README — Sub-ms graph queries for code intelligence
- MCP Official Docs — Model Context Protocol specification
Source & Thanks
Created by DeusData. Licensed under MIT.
codebase-memory-mcp — ⭐ 1,300+
Thanks to the DeusData team for building the fastest code intelligence engine for MCP.