Workflows · May 8, 2026 · 4 min read

OpenRouter — Unified API for 300+ LLMs with Auto Failover

OpenRouter is one OpenAI-compatible endpoint for 300+ LLMs across 60+ providers. Transparent pricing, no markup, automatic failover when a route is down.

Agent ready

This asset can be read and installed directly by agents

TokRepo exposes a universal CLI command, install contract, metadata JSON, adapter-aware plan, and raw content links so agents can judge fit, risk, and next actions.

Stage only · 17/100
Agent surface: Any MCP/CLI agent
Kind: Skill
Install: Stage only
Trust: New
Entrypoint: Asset
Universal CLI install command
npx tokrepo install 7bb772b3-1ab0-4d27-a758-1cd9acc4f6ff
Intro

OpenRouter is the universal LLM gateway: one OpenAI-compatible endpoint that routes to 300+ models across Anthropic, OpenAI, Google, Meta, Mistral, DeepSeek, and 50+ other providers. Pricing is transparent (no markup over the underlying provider), failover is automatic when a route goes down, and BYOK lets you use your own provider account through OpenRouter's interface. Best for: developers who want one provider abstraction without giving up cost control. Works with: any OpenAI SDK. Setup time: 1 minute.


Hello, OpenRouter

import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",  # provider/model
    messages=[{"role": "user", "content": "Hello"}],
)

The same SDK works for openai/gpt-4o, google/gemini-2.5-pro, deepseek/deepseek-chat, meta-llama/llama-3.3-70b-instruct, etc.
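Because every model shares the same request shape, switching providers is a one-string change. A minimal sketch (the live call is guarded so it only fires when OPENROUTER_API_KEY is set; chat_kwargs is a hypothetical helper for illustration, not part of the SDK):

```python
import os

# Model IDs all follow "<provider>/<model>"; the request shape never changes.
MODELS = [
    "openai/gpt-4o",
    "google/gemini-2.5-pro",
    "deepseek/deepseek-chat",
    "meta-llama/llama-3.3-70b-instruct",
]

def chat_kwargs(model: str, prompt: str) -> dict:
    """Hypothetical helper: build identical request kwargs for any model."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

if os.environ.get("OPENROUTER_API_KEY"):  # guard: skip the call without a key
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
    for m in MODELS:
        r = client.chat.completions.create(**chat_kwargs(m, "Hello"))
        print(m, "->", r.choices[0].message.content)
```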

Auto routing

response = client.chat.completions.create(
    model="openrouter/auto",  # let OpenRouter pick
    messages=[{"role": "user", "content": "Explain quantum tunneling"}],
)

Auto routing picks a model by balancing cost, latency, and capability for your prompt. Override it with models=["anthropic/claude-3.5-sonnet", "openai/gpt-4o"] for an explicit fallback list.
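With the OpenAI SDK, that fallback list travels in extra_body as a top-level models key, which OpenRouter tries in order. A hedged sketch (the live call is guarded behind the API key):

```python
import os

# Assumption: OpenRouter accepts a top-level "models" list as an explicit
# fallback chain; the OpenAI SDK forwards extra fields via extra_body.
fallback = {"models": ["anthropic/claude-3.5-sonnet", "openai/gpt-4o"]}

if os.environ.get("OPENROUTER_API_KEY"):  # guard: skip the call without a key
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
    response = client.chat.completions.create(
        model="openrouter/auto",
        messages=[{"role": "user", "content": "Explain quantum tunneling"}],
        extra_body=fallback,
    )
    print(response.choices[0].message.content)
```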

Provider preferences

extra_body = {
    "provider": {
        "order": ["anthropic", "openai"],   # try Anthropic first
        "allow_fallbacks": True,
        "data_collection": "deny",          # opt out of provider training
        "require_parameters": True,
    },
}

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",
    messages=[...],
    extra_body=extra_body,
)

Why OpenRouter vs LiteLLM Proxy

                     OpenRouter                    LiteLLM Proxy
Hosting              Hosted only                   Self-host or hosted
Provider count       300+ models / 60+ providers   100+ providers
Markup               None (transparent)            Self-host: zero
Auth / RBAC          Per API key                   Self-managed
BYOK to providers    Yes                           Yes
Best for             Indie devs, prototypes        Enterprise self-host

FAQ

Q: Is OpenRouter free? A: OpenRouter adds no markup; you pay only what the underlying provider charges. New accounts start with $1 in initial credits, and some models offer free tiers (e.g. Llama via Together's free endpoint, the Gemini 2.5 free tier) accessible through OpenRouter.

Q: Will my API key work directly with provider APIs? A: No — OpenRouter API keys (sk-or-v1-...) only work with openrouter.ai/api/v1. Provider-direct keys are separate. You can BYOK provider keys to OpenRouter to route through them at provider rates.

Q: Does OpenRouter store my prompts? A: By default, yes (to power analytics dashboards), but you can set data_collection: "deny" to opt out per request, or opt out globally in your OpenRouter account settings. Many providers also have their own opt-out, so set both.


Quick Use

  1. Sign up at openrouter.ai → copy your API key (starts with sk-or-v1-)
  2. Set OpenAI SDK base_url to https://openrouter.ai/api/v1
  3. Use model="<provider>/<model>" (e.g. anthropic/claude-3.5-sonnet)
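The three steps above can be sketched as one script (the request only fires when OPENROUTER_API_KEY is set):

```python
import os

BASE_URL = "https://openrouter.ai/api/v1"   # step 2: point the SDK here
MODEL = "anthropic/claude-3.5-sonnet"       # step 3: <provider>/<model>

key = os.environ.get("OPENROUTER_API_KEY", "")  # step 1: starts with sk-or-v1-
if key:
    from openai import OpenAI

    client = OpenAI(base_url=BASE_URL, api_key=key)
    reply = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": "ping"}],
    )
    print(reply.choices[0].message.content)
```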

Source & Thanks

Built by OpenRouter. Commercial product with free credits.

openrouter.ai/docs — Official documentation

