mcp-use — Fullstack MCP Framework for AI Agents & Apps
Build MCP servers and apps for ChatGPT, Claude, and any LLM with TypeScript or Python SDK. Includes inspector, cloud deploy, and interactive widgets.
What it is
mcp-use is a fullstack MCP (Model Context Protocol) framework for building servers and applications that work with ChatGPT, Claude, and any LLM that supports MCP. It provides both TypeScript and Python SDKs, an interactive inspector for debugging, cloud deployment tooling, and UI widgets for building agent interfaces.
The framework targets developers building MCP-powered tools who want a structured approach rather than wiring raw protocol handlers manually. It handles transport, serialization, tool registration, and client-side rendering so you can focus on the business logic of your MCP tools.
How it saves time or tokens
Building an MCP server from scratch requires understanding the protocol specification, implementing JSON-RPC transport, handling tool schemas, and managing lifecycle events. mcp-use abstracts these layers into a declarative SDK where you define tools as functions and the framework handles everything else. The inspector lets you test tools interactively before connecting to an LLM, catching issues early. Cloud deploy packages your server as a container with a single command.
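To make the abstraction concrete, here is a hand-rolled sketch of the kind of JSON-RPC dispatch layer the paragraph above describes, the part mcp-use generates for you. This is an illustrative simplification, not mcp-use internals: the `TOOLS` registry and `handle_message` helper are hypothetical, though the `tools/call` method name and JSON-RPC 2.0 framing come from the MCP specification.

```python
import json

# Hypothetical sketch of the raw JSON-RPC dispatch an MCP server must
# implement by hand; mcp-use derives this layer from your tool functions.
TOOLS = {
    # Stubbed handler standing in for a real tool implementation.
    "get_weather": lambda args: {"temperature": 21, "condition": "clear"},
}

def handle_message(raw: str) -> str:
    """Dispatch one JSON-RPC 2.0 request to a registered tool."""
    req = json.loads(raw)
    if req.get("method") == "tools/call":
        name = req["params"]["name"]
        result = TOOLS[name](req["params"].get("arguments", {}))
        return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
    # Anything else is unsupported in this sketch.
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                       "error": {"code": -32601, "message": "Method not found"}})
```

With the declarative SDK, you write only the handler body; transport framing, routing, and error responses like the one above are handled by the framework.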
How to use
- Create a new MCP app with TypeScript:

  npx create-mcp-use-app@latest

  Or install the Python SDK:

  pip install mcp-use
- Define a tool in your server:

  import { McpServer } from 'mcp-use';

  const server = new McpServer({ name: 'my-tools' });

  server.tool('get_weather', {
    description: 'Get current weather for a city',
    parameters: { city: { type: 'string' } },
    handler: async ({ city }) => {
      const data = await fetchWeather(city);
      return { temperature: data.temp, condition: data.condition };
    }
  });

  server.start();
- Test with the inspector, then add to your AI assistant's MCP config.
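Registering the server with a client typically means adding an entry to that client's MCP config. The exact file location and schema depend on the client; as one example, Claude Desktop reads a `claude_desktop_config.json` with an `mcpServers` map (the server name and command below are placeholders for your own build output):

```json
{
  "mcpServers": {
    "my-tools": {
      "command": "node",
      "args": ["dist/server.js"]
    }
  }
}
```

The client launches the listed command and speaks MCP to it over stdio, so the entry must point at a runnable server binary or script.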
Example
Python SDK example:

  from mcp_use import McpServer

  server = McpServer(name='my-tools')

  @server.tool(description='Search documents by query')
  def search_docs(query: str, limit: int = 10):
      results = db.search(query, limit=limit)
      return [{'title': r.title, 'snippet': r.snippet} for r in results]

  server.start()
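When a client calls the MCP `tools/list` method, a server advertises each tool with a name, description, and a JSON Schema for its parameters. The descriptor below is a hand-written sketch of what `search_docs` would plausibly look like on the wire (field values inferred from the decorator and type hints above, not captured from mcp-use output):

```python
# Illustrative tools/list entry for the search_docs tool defined above.
# The inputSchema shape follows the MCP specification; the exact schema
# mcp-use derives from Python type hints is an assumption here.
search_docs_descriptor = {
    "name": "search_docs",
    "description": "Search documents by query",
    "inputSchema": {
        "type": "object",
        "properties": {
            "query": {"type": "string"},
            "limit": {"type": "integer", "default": 10},
        },
        "required": ["query"],  # limit has a default, so it is optional
    },
}
```

This is the schema the LLM sees when deciding how to call your tool, which is why clear descriptions and parameter types matter.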
Related on TokRepo
- MCP Chrome Integration — browser automation through MCP
- MCP GitHub Integration — GitHub access via MCP protocol
Common pitfalls
- The TypeScript and Python SDKs have slightly different API surfaces; do not assume code translates directly between them
- Cloud deploy requires Docker; ensure your tool dependencies are included in the container build
- Interactive widgets are client-side only and do not work in headless MCP connections from CLI tools like Claude Code
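Regarding the Docker pitfall above: if you need to customize the container build, a minimal hand-written Dockerfile for a Python mcp-use server might look like the following. This is a sketch under assumptions (entry point named `server.py`, dependencies listed in `requirements.txt`), not the file the deploy tooling generates:

```dockerfile
FROM python:3.12-slim
WORKDIR /app
# Install the SDK plus your tool's own dependencies. A common failure mode
# is forgetting packages that handlers import only at runtime.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "server.py"]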
Frequently Asked Questions
What is MCP?

MCP (Model Context Protocol) is a standard protocol for connecting AI assistants to external tools and data sources. It defines how an LLM discovers, calls, and receives results from tools. mcp-use makes building MCP-compatible tools straightforward.

Which LLM clients can connect to mcp-use servers?

Any LLM client that supports MCP can connect to mcp-use servers. This includes Claude (via Claude Code and Claude Desktop), ChatGPT (with MCP plugin support), and custom clients using the MCP SDK.

Can I deploy mcp-use servers to the cloud?

Yes. mcp-use includes cloud deployment tooling that packages your server as a Docker container. You can deploy to any container hosting platform including AWS ECS, Google Cloud Run, or Railway.

What does the inspector do?

The inspector provides a web UI where you can call your MCP tools interactively, see the request/response payloads, and debug issues without connecting an LLM. It runs locally alongside your development server.

Is mcp-use compatible with the official MCP SDK?

mcp-use implements the MCP specification and is interoperable with servers and clients built using the official Anthropic MCP SDK. You can mix mcp-use servers with official SDK clients and vice versa.
Citations (3)
- mcp-use GitHub — mcp-use fullstack MCP framework
- MCP Specification — Model Context Protocol specification
- Anthropic Docs — Anthropic MCP documentation