MCP Configs · May 8, 2026 · 4 min read

OpenRouter MCP — One Server for 300+ LLMs in Claude Code

OpenRouter MCP exposes all 300+ OpenRouter models to Claude Code, Cursor, and Codex CLI as a single MCP server. Switch models per task, bring your own routing, no extra SDKs.

Agent-ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, an install contract, JSON metadata, a per-adapter plan, and the raw content to help agents judge fit, risk, and next steps.

Stage only · 5/100
Agent surface: any MCP/CLI agent
Type: MCP config
Installation: stage only
Trust: new
Entry point: asset
Universal CLI command:
npx tokrepo install 80e8253c-8910-4d4b-ab35-4c9771e8150a
Introduction

OpenRouter MCP wraps OpenRouter's universal LLM gateway as a Model Context Protocol server. Inside Claude Code, Cursor, or Codex CLI, you can call openrouter_chat and pick any of 300+ models per task — switch from Claude to GPT-4o to Llama 3.3 mid-session without changing your host. Best for: agents that want to delegate sub-tasks to cheaper or specialized models. Works with: any MCP host. Setup time: 3 minutes.


MCP config

{
  "mcpServers": {
    "openrouter": {
      "command": "npx",
      "args": ["-y", "openrouter-mcp"],
      "env": {
        "OPENROUTER_API_KEY": "sk-or-v1-...",
        "OPENROUTER_DEFAULT_MODEL": "anthropic/claude-3.5-haiku"
      }
    }
  }
}
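If you prefer scripting the setup, the same server entry can be merged into a project-scoped MCP config file programmatically. A minimal sketch, assuming the key lives in the OPENROUTER_API_KEY environment variable and that your host reads a JSON file of this shape (Claude Code, for instance, picks up a project-level .mcp.json):

```python
import json
import os
from pathlib import Path

def write_mcp_config(path=".mcp.json"):
    """Merge the openrouter server entry into an MCP config file,
    preserving any servers already configured there."""
    p = Path(path)
    config = json.loads(p.read_text()) if p.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["openrouter"] = {
        "command": "npx",
        "args": ["-y", "openrouter-mcp"],
        "env": {
            # Read the key from the environment rather than hardcoding it.
            "OPENROUTER_API_KEY": os.environ.get("OPENROUTER_API_KEY", ""),
            "OPENROUTER_DEFAULT_MODEL": "anthropic/claude-3.5-haiku",
        },
    }
    p.write_text(json.dumps(config, indent=2))
    return config
```

Keeping the key in the environment also means the config file can be committed without leaking credentials.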

Tools exposed

Tool                     Use
openrouter_chat          Send a chat completion via OpenRouter
openrouter_list_models   Filter the 300+ catalog by capability / price / context
openrouter_get_model     Cost, context, and capabilities for one model
openrouter_set_default   Switch the default model for this session
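Under the hood, the wrappers call OpenRouter's OpenAI-compatible REST endpoint. A sketch of the request a tool like openrouter_chat presumably builds (the endpoint and payload shape follow OpenRouter's documented chat API; the helper name is ours):

```python
import json
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(api_key, model, messages, max_tokens=None):
    """Build the HTTP request a chat-completion tool call would send."""
    payload = {"model": model, "messages": messages}
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens  # hard cap on output spend per call
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending it is one urllib.request.urlopen(req) away; the response follows the OpenAI chat-completions shape, with the text in choices[0].message.content.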

Common patterns

> use the openrouter_chat tool with model "deepseek/deepseek-chat" to summarize
  this 50-page PDF (cheap model for bulk text)

> use openrouter_list_models to find models with <$0.50/1M input cost and
  >100K context window

> use openrouter_chat with model "openai/o1" for the hardest reasoning step,
  then back to claude-haiku for the boilerplate
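The price/context filter in the second pattern can also be reproduced locally over the catalog the models endpoint returns. A sketch assuming the documented entry shape, where pricing.prompt is a USD-per-input-token string and context_length is an integer:

```python
def filter_models(models, max_usd_per_m_input=0.50, min_context=100_000):
    """Keep model IDs under a per-million-input-token price cap
    that also offer at least the requested context window."""
    picked = []
    for m in models:
        # pricing.prompt is USD per single input token; scale to per-million.
        usd_per_m = float(m["pricing"]["prompt"]) * 1_000_000
        if usd_per_m < max_usd_per_m_input and m["context_length"] > min_context:
            picked.append(m["id"])
    return picked
```

The same predicate is what a "find cheap long-context models" prompt asks openrouter_list_models to evaluate server-side.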

Why use OpenRouter MCP vs direct provider MCPs

A separate MCP per provider (Anthropic MCP, OpenAI MCP, …) means N config blocks, N keys, N tool prefixes. OpenRouter MCP gives you one config / one key / one tool prefix for all of them. Trade-off: you depend on OpenRouter's availability for everything (their uptime is good but it's a single point of failure).


FAQ

Q: Is OpenRouter MCP official? A: OpenRouter has multiple community-built MCP wrappers (search npm for 'openrouter-mcp'); they all use OpenRouter's REST API. OpenRouter itself doesn't ship an official one yet.

Q: Can I cap spend per MCP call? A: Yes — set max_tokens in each call, and use OpenRouter's account-level spend limit. Combined with PostHog observability you can also enforce per-feature budgets.
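A max_tokens cap translates directly into a computable worst-case output cost per call. A small sketch (the price used is illustrative, not a current rate):

```python
def worst_case_output_cost(max_tokens, usd_per_m_output_tokens):
    """Upper bound on what a single call's completion can cost,
    given a max_tokens cap and the model's output price per million tokens."""
    return max_tokens * usd_per_m_output_tokens / 1_000_000

# A 4096-token cap on a hypothetical $5/M-output model risks at most
# about $0.02 per call, before any input-token cost.
```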

Q: Does this work for non-coding agents? A: Yes — OpenRouter's tools work for any text task. The MCP just exposes them to MCP hosts. Same patterns work for chat apps, content generation, data extraction agents.


Quick Use

  1. Sign up at openrouter.ai → API key (starts sk-or-v1-)
  2. Add the JSON snippet above to your MCP config
  3. Restart your MCP host — openrouter_chat and friends are now available



Source & Thanks

Built by the community around OpenRouter; the wrappers are MIT-licensed.

openrouter.ai — OpenRouter platform
