Scripts · Mar 31, 2026 · 2 min read

mcp-agent — Build AI Agents with MCP Patterns


Quick Use

Use it first, then decide how deep to go

Copy, install, and run the commands below first; go deeper into the docs once the basics work.

# Install with uv (recommended)
uv add "mcp-agent[anthropic]"

# Or with multiple providers
uv add "mcp-agent[openai, anthropic, google, azure, bedrock]"

# Or with pip
pip install "mcp-agent[anthropic]"
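mcp-agent reads its MCP server connections from a project-level `mcp_agent.config.yaml`. A minimal sketch, assuming the stdio-launched `fetch` and `filesystem` servers used in the example below (check the repo's README for the exact schema and for `mcp_agent.secrets.yaml`, where provider API keys go):

```yaml
# mcp_agent.config.yaml — illustrative; server commands/args are assumptions
mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
```

The `server_names` an Agent declares in code are looked up against these entries at runtime.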

Intro

mcp-agent is a Python framework for building effective AI agents using the Model Context Protocol (MCP). With 8,200+ GitHub stars, it implements Anthropic's Building Effective Agents patterns in composable, reusable components. Connect any LLM to MCP servers (filesystem, databases, Slack, Jira, GitHub) without writing custom adapters. Supports orchestrator, map-reduce, evaluator-optimizer, and router workflow patterns out of the box.

Best for: developers building multi-step AI agent workflows with MCP tool integration
Works with: Claude Code, OpenAI Codex, Cursor, Gemini CLI, Windsurf
Providers: OpenAI (GPT-4o), Anthropic Claude, Google, Azure OpenAI, AWS Bedrock


Key Features

  • Full MCP support: Tools, resources, prompts, notifications, OAuth, and sampling
  • Composable patterns: Orchestrator-workers, parallel map-reduce, evaluator-optimizer loops, intent routers
  • Multi-provider: OpenAI, Anthropic, Google, Azure, Bedrock with unified interface
  • Durable execution: Temporal backend for reliable long-running workflows
  • Agent-as-MCP-server: Expose your agents as MCP servers for other agents to call
  • Structured logging: Built-in token accounting and observability
  • Cloud deployment: Deploy agents to production with scaling support
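To make the evaluator-optimizer pattern from the list above concrete, here is a minimal, library-free Python sketch of the loop's control flow. The `draft` and `score` functions are toy stand-ins for LLM calls, not mcp-agent APIs; the framework's own implementation wraps real agents in the same generate → critique → revise cycle.

```python
# Evaluator-optimizer loop: generate, score, feed the critique back
# until the score clears a threshold or the round budget runs out.
def evaluator_optimizer(task, draft, score, threshold=0.9, max_rounds=3):
    feedback = None
    best = None
    for _ in range(max_rounds):
        candidate = draft(task, feedback)          # optimizer step
        rating, feedback = score(task, candidate)  # evaluator step
        if best is None or rating > best[0]:
            best = (rating, candidate)
        if rating >= threshold:
            break
    return best[1]

# Toy stand-ins: the optimizer revises on critique, the evaluator
# rewards longer answers up to a cap.
def draft(task, feedback):
    return task + (" (revised)" if feedback else "")

def score(task, text):
    rating = min(len(text) / 30, 1.0)
    return rating, ("add detail" if rating < 0.9 else None)

print(evaluator_optimizer("Summarize the report", draft, score))
# → Summarize the report (revised)
```

Keeping the best candidate seen so far means the loop still returns something useful when the round budget is exhausted before the threshold is met.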

Example

import asyncio

from mcp_agent.app import MCPApp
from mcp_agent.agents.agent import Agent
from mcp_agent.workflows.llm.augmented_llm import RequestParams
from mcp_agent.workflows.llm.augmented_llm_anthropic import AnthropicAugmentedLLM

app = MCPApp(name="my-agent")

async def main():
    async with app.run():
        agent = Agent(
            name="researcher",
            instruction="You are a research assistant.",
            server_names=["fetch", "filesystem"],
        )
        async with agent:
            # attach_llm takes an LLM class/factory, not a provider string
            llm = await agent.attach_llm(AnthropicAugmentedLLM)
            result = await llm.generate_str(
                "Summarize the latest AI news",
                request_params=RequestParams(model="claude-sonnet-4-5-20250514"),
            )
            print(result)

asyncio.run(main())
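The map-reduce pattern mentioned earlier follows the same shape as a plain asyncio fan-out, which is worth seeing without the framework. In this sketch `summarize` is a stand-in for an agent/LLM call (not an mcp-agent API); the pattern is simply concurrent map, then a reduce over the partial results.

```python
# Library-free sketch of parallel map-reduce: fan a task out to
# concurrent workers, then merge their outputs into one answer.
import asyncio

async def summarize(source: str) -> str:
    await asyncio.sleep(0)  # stand-in for a network-bound LLM call
    return f"summary of {source}"

async def map_reduce(sources: list[str]) -> str:
    # Map: one worker per source, run concurrently.
    partials = await asyncio.gather(*(summarize(s) for s in sources))
    # Reduce: merge partial results.
    return "; ".join(partials)

print(asyncio.run(map_reduce(["arxiv", "hn", "blogs"])))
# → summary of arxiv; summary of hn; summary of blogs
```

`asyncio.gather` preserves input order, so the reduce step is deterministic even though the workers run concurrently.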

FAQ

Q: What is mcp-agent? A: mcp-agent is a Python framework with 8.2K+ stars for building AI agents using the Model Context Protocol. It provides composable workflow patterns like orchestrator, map-reduce, and evaluator-optimizer, supporting 5 LLM providers.

Q: How do I install mcp-agent? A: Run uv add "mcp-agent[anthropic]" or pip install "mcp-agent[anthropic]". Add provider extras like [openai, google, azure, bedrock] as needed.

Q: Which LLM providers does mcp-agent support? A: OpenAI (GPT-4o default), Anthropic Claude, Google, Azure OpenAI, and AWS Bedrock — all through a unified interface.


Source & Thanks

Created by LastMile AI. Licensed under MIT. lastmile-ai/mcp-agent — 8,200+ GitHub stars