MCP Configs · April 1, 2026 · 1 min read

MCP-Use — Build Full-Stack MCP Apps

MCP-Use is a full-stack framework for building MCP apps that work with ChatGPT, Claude, and any AI agent. 9.6K+ stars. Python SDK. MIT.

Quick Use

Use it first, then decide whether to dig deeper.

This section should tell both users and agents what to copy first, what to install, and where it goes.

#!/usr/bin/env python3
"""
MCP-Use — Build Full-Stack MCP Apps

Full-stack framework for developing MCP applications that work with
ChatGPT, Claude, and other AI agents. 9.6K+ GitHub stars.

Quick start: pip install mcp-use

GitHub: https://github.com/mcp-use/mcp-use
License: MIT
"""

=== Quick Use ===

pip install mcp-use

from mcp_use import MCPClient, MCPAgent
from langchain_openai import ChatOpenAI

# Connect to any MCP server
client = MCPClient.from_config({
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
        }
    }
})

# Create an agent that can use MCP tools
llm = ChatOpenAI(model="gpt-4o")
agent = MCPAgent(llm=llm, client=client)

# The agent can now use any tool from the connected MCP servers
result = agent.run("List all files in /tmp and summarize their contents")
print(result)

=== Key Features ===

- Connect to multiple MCP servers simultaneously

- Works with any LLM (OpenAI, Anthropic, local models)

- LangChain integration for complex chains

- Automatic tool discovery and execution

- Streaming support

- Error handling and retry logic

- TypeScript and Python SDKs

=== Multi-Server Example ===

multi_config = {
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
        },
        "github": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-github"],
            "env": {"GITHUB_TOKEN": "ghp_..."}
        }
    }
}

client = MCPClient.from_config(multi_config)
agent = MCPAgent(llm=llm, client=client)

# The agent can use tools from BOTH servers
result = agent.run("Read my README.md and create a GitHub issue summarizing it")
print(result)
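Hardcoding the token in the config is shown above only for brevity. A safer pattern is to assemble the same `mcpServers` mapping with the token pulled from the environment. A minimal sketch (the `build_github_config` helper and the `ghp_example` value are hypothetical, not part of mcp-use):

```python
import os

def build_github_config(token_env="GITHUB_TOKEN", root="."):
    """Build an mcpServers config dict, reading the GitHub token from the environment."""
    token = os.environ.get(token_env)
    if not token:
        raise RuntimeError(f"set {token_env} before starting the agent")
    return {
        "mcpServers": {
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", root],
            },
            "github": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-github"],
                "env": {"GITHUB_TOKEN": token},
            },
        }
    }

os.environ.setdefault("GITHUB_TOKEN", "ghp_example")  # demo value only
config = build_github_config()
print(sorted(config["mcpServers"]))  # → ['filesystem', 'github']
```

The resulting dict has the same shape as `multi_config` above and can be passed to the client the same way, while keeping secrets out of source control.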
