#!/usr/bin/env python3
"""
MCP-Use — Build Full-Stack MCP Apps

Full-stack framework for developing MCP applications that work with
ChatGPT, Claude, and other AI agents. 9.6K+ GitHub stars.

Quick start: pip install mcp-use
GitHub: https://github.com/mcp-use/mcp-use
License: MIT
"""
=== Quick Use ===
pip install mcp-use
from mcp_use import MCPClient, MCPAgent
from langchain_openai import ChatOpenAI

# Connect to any MCP server
client = MCPClient.from_config({
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
        }
    }
})

# Create an agent that can use MCP tools
llm = ChatOpenAI(model="gpt-4o")
agent = MCPAgent(llm=llm, client=client)

# The agent can now use any tool from the connected MCP servers
result = agent.run("List all files in /tmp and summarize their contents")
print(result)
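The dictionary passed to `MCPClient.from_config` is plain JSON-serializable data, so the same server definitions can live in a version-controlled config file. A minimal sketch of that round trip (the file name here is illustrative):

```python
import json
import pathlib
import tempfile

# The same server definition as above, kept as plain data.
config = {
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        }
    }
}

# Write it to a JSON file and read it back unchanged; the loaded dict can be
# handed to MCPClient.from_config exactly like the inline literal.
path = pathlib.Path(tempfile.gettempdir()) / "mcp_config.json"  # illustrative path
path.write_text(json.dumps(config, indent=2))
loaded = json.loads(path.read_text())
print(loaded == config)  # True
```

Keeping the config external means server definitions can be swapped without touching agent code.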
=== Key Features ===
- Connect to multiple MCP servers simultaneously
- Works with any LLM (OpenAI, Anthropic, local models)
- LangChain integration for complex chains
- Automatic tool discovery and execution
- Streaming support
- Error handling and retry logic
- TypeScript and Python SDKs
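The retry behavior in that list can be pictured with a small sketch. This is illustrative only, not mcp-use's actual implementation: a hypothetical `with_retries` helper that retries a flaky tool call with exponential backoff, as an agent framework might do internally:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn(), retrying with exponential backoff on any exception.

    Illustrative only — not mcp-use's actual retry logic.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts; surface the error
            time.sleep(base_delay * (2 ** i))

# A stand-in for a tool call that fails twice before succeeding.
calls = {"n": 0}
def flaky_tool():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(with_retries(flaky_tool))  # succeeds on the third attempt: "ok"
```

Backoff spreads retries out so a briefly unavailable server is not hammered with immediate repeat calls.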
=== Multi-Server Example ===
multi_config = {
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
        },
        "github": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-github"],
            "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..."}
        }
    }
}

client = MCPClient.from_config(multi_config)
agent = MCPAgent(llm=llm, client=client)

# Agent can use tools from BOTH servers
result = agent.run("Read my README.md and create a GitHub issue summarizing it")
print(result)
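Rather than hardcoding a token in the config literal, secrets can be pulled from the process environment. A hedged sketch — the `build_config` helper is hypothetical, and `GITHUB_PERSONAL_ACCESS_TOKEN` is assumed to be the variable name the GitHub MCP server expects:

```python
import os

def build_config(root: str = ".") -> dict:
    """Build the multi-server config with the GitHub token read from the
    environment instead of being hardcoded in source. Hypothetical helper."""
    token = os.environ.get("GITHUB_PERSONAL_ACCESS_TOKEN", "")
    return {
        "mcpServers": {
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", root],
            },
            "github": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-github"],
                "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": token},
            },
        }
    }

config = build_config()
print(sorted(config["mcpServers"]))  # ['filesystem', 'github']
```

This keeps tokens out of version control and lets the same code run with different credentials per environment.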