# OpenRouter MCP — One Server for 300+ LLMs in Claude Code

> OpenRouter MCP exposes all 300+ OpenRouter models to Claude Code, Cursor, and Codex CLI as a single MCP server. Switch models per task, bring your own routing, no extra SDKs.

## Install

Merge the JSON snippet under "MCP config" below into your `.mcp.json`.

## Quick Use

1. Sign up at openrouter.ai and create an API key (starts with `sk-or-v1-`)
2. Add the JSON snippet below to your MCP config
3. Restart your MCP host — `openrouter_chat` and friends are now available

---

## Intro

OpenRouter MCP wraps OpenRouter's universal LLM gateway as a Model Context Protocol server. Inside Claude Code, Cursor, or Codex CLI, you can call `openrouter_chat` and pick any of 300+ models per task — switching from Claude to GPT-4o to Llama 3.3 mid-session without changing your host.

Best for: agents that want to delegate sub-tasks to cheaper or specialized models. Works with: any MCP host. Setup time: 3 minutes.

---

### MCP config

```json
{
  "mcpServers": {
    "openrouter": {
      "command": "npx",
      "args": ["-y", "openrouter-mcp"],
      "env": {
        "OPENROUTER_API_KEY": "sk-or-v1-...",
        "OPENROUTER_DEFAULT_MODEL": "anthropic/claude-3.5-haiku"
      }
    }
  }
}
```

### Tools exposed

| Tool | Use |
|---|---|
| `openrouter_chat` | Send a chat completion via OpenRouter |
| `openrouter_list_models` | Filter the 300+ model catalog by capability / price / context |
| `openrouter_get_model` | Cost, context, and capabilities for one model |
| `openrouter_set_default` | Switch the default model for this session |

### Common patterns

```
> use the openrouter_chat tool with model "deepseek/deepseek-chat" to summarize this 50-page PDF (cheap model for bulk text)
> use openrouter_list_models to find models with <$0.50/1M input cost and >100K context window
> use openrouter_chat with model "openai/o1" for the hardest reasoning step, then back to claude-haiku for the boilerplate
```

### Why use OpenRouter MCP vs direct provider MCPs

A separate MCP per provider (Anthropic MCP, OpenAI MCP, …) means N config blocks, N keys, and N tool prefixes.
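For illustration, here is what that sprawl looks like in config. The three per-provider entries below are hypothetical (these server names and npm packages are made up for the comparison), while the single-entry alternative is the `openrouter` block shown under "MCP config" above:

```json
{
  "mcpServers": {
    "anthropic": { "command": "npx", "args": ["-y", "anthropic-mcp"], "env": { "ANTHROPIC_API_KEY": "sk-ant-..." } },
    "openai":    { "command": "npx", "args": ["-y", "openai-mcp"],    "env": { "OPENAI_API_KEY": "sk-..." } },
    "meta":      { "command": "npx", "args": ["-y", "meta-llama-mcp"], "env": { "META_API_KEY": "..." } }
  }
}
```

Three keys to rotate, three tool prefixes to remember, three blocks to keep in sync.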
OpenRouter MCP gives you one config, one key, and one tool prefix for all of them. The trade-off: you depend on OpenRouter's availability for everything (their uptime is good, but it's a single point of failure).

---

### FAQ

**Q: Is OpenRouter MCP official?**
A: No. OpenRouter has multiple community-built MCP wrappers (search npm for `openrouter-mcp`); they all use OpenRouter's REST API. The most active is the community `openrouter-mcp` package. OpenRouter itself doesn't ship an official one yet.

**Q: Can I cap spend per MCP call?**
A: Yes — set `max_tokens` in each call, and use OpenRouter's account-level spend limit. Combined with PostHog observability, you can also enforce per-feature budgets.

**Q: Does this work for non-coding agents?**
A: Yes — OpenRouter's tools work for any text task; the MCP server just exposes them to MCP hosts. The same patterns work for chat apps, content generation, and data-extraction agents.

---

## Source & Thanks

> Built by the community / [OpenRouter](https://github.com/OpenRouterTeam). MIT-licensed wrappers.
>
> [openrouter.ai](https://openrouter.ai) — the OpenRouter platform

---

Source: https://tokrepo.com/en/workflows/openrouter-mcp-one-server-for-300-llms-in-claude-code
Author: OpenRouter
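For reference, every wrapper discussed above ultimately speaks OpenRouter's public REST API. A minimal sketch of the HTTP request one `openrouter_chat` call boils down to, using only the Python standard library (the endpoint and payload shape follow OpenRouter's chat-completions API; `build_request` is an illustrative helper, not part of any wrapper):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str,
                  max_tokens: int = 256) -> urllib.request.Request:
    """Build the HTTP request behind one chat call.

    max_tokens doubles as the per-call spend cap mentioned in the FAQ.
    """
    body = {
        "model": model,  # e.g. "deepseek/deepseek-chat" or "openai/o1"
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# To actually send it (requires a real sk-or-v1-... key):
# req = build_request("sk-or-v1-...", "deepseek/deepseek-chat", "Summarize: ...")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Swapping the `model` string is all it takes to move a sub-task to a cheaper or more capable model, which is exactly the per-task switching the MCP tools expose.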