# OpenRouter — Unified API for 300+ LLMs with Auto Failover

> OpenRouter is one OpenAI-compatible endpoint for 300+ LLMs across 60+ providers. Transparent pricing, no markup, automatic failover when a route is down.

## Quick Use

1. Sign up at openrouter.ai → copy your API key (starts with `sk-or-v1-`)
2. Point the OpenAI SDK's `base_url` at `https://openrouter.ai/api/v1`
3. Use `model="<provider>/<model>"` (e.g. `anthropic/claude-3.5-sonnet`)

---

## Intro

OpenRouter is the universal LLM gateway: one OpenAI-compatible endpoint that routes to 300+ models across Anthropic, OpenAI, Google, Meta, Mistral, DeepSeek, and 50+ other providers. Pricing is transparent (no markup over the underlying provider), failover is automatic when a route is down, and BYOK lets you use your own provider account through OpenRouter's interface.

Best for: developers who want one provider abstraction without giving up cost control.
Works with: any OpenAI SDK.
Setup time: 1 minute.

---

### Hello, OpenRouter

```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",  # provider/model
    messages=[{"role": "user", "content": "Hello"}],
)
```

The same SDK call works for `openai/gpt-4o`, `google/gemini-2.5-pro`, `deepseek/deepseek-chat`, `meta-llama/llama-3.3-70b-instruct`, etc.

### Auto routing

```python
response = client.chat.completions.create(
    model="openrouter/auto",  # let OpenRouter pick
    messages=[{"role": "user", "content": "Explain quantum tunneling"}],
)
```

Auto routing picks a model based on a balance of cost, latency, and capability for your prompt. For an explicit fallback list, override with `models=["anthropic/claude-3.5-sonnet", "openai/gpt-4o"]`.
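The explicit fallback list rides on the SDK's `extra_body` passthrough, since `models` is an OpenRouter-specific field rather than a standard OpenAI parameter. A minimal sketch of the request payload (the prompt text and fallback choices here are just illustrations):

```python
# Request kwargs for a call with an explicit fallback chain.
# "models" is OpenRouter-specific, so it goes through extra_body;
# the OpenAI SDK forwards extra_body fields into the request verbatim.
request = dict(
    model="anthropic/claude-3.5-sonnet",  # tried first
    extra_body={
        # tried in order if the primary model errors or is unavailable
        "models": ["openai/gpt-4o", "deepseek/deepseek-chat"],
    },
    messages=[{"role": "user", "content": "Explain quantum tunneling"}],
)

# With a configured client (see "Hello, OpenRouter" above):
#   response = client.chat.completions.create(**request)
# response.model then reports which model actually served the request.
```

Checking `response.model` after the call is the simplest way to see whether a fallback fired.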
### Provider preferences

```python
extra_body = {
    "provider": {
        "order": ["anthropic", "openai"],  # try Anthropic first
        "allow_fallbacks": True,
        "data_collection": "deny",  # opt out of provider training
        "require_parameters": True,
    },
}

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",
    messages=[...],
    extra_body=extra_body,
)
```

### Why OpenRouter vs LiteLLM Proxy

| | OpenRouter | LiteLLM Proxy |
|---|---|---|
| Hosting | Hosted only | Self-host or hosted |
| Provider count | 300+ models / 60+ providers | 100+ providers |
| Markup | None (transparent) | Self-host: zero |
| Auth / RBAC | Per API key | Self-managed |
| BYOK to providers | Yes | Yes |
| Best for | Indie devs, prototypes | Enterprise self-host |

---

### FAQ

**Q: Is OpenRouter free?**
A: OpenRouter charges only what the underlying provider charges, with no markup. There is a free tier of $1 in initial credits, and some models offer free tiers (e.g. Llama via Together's free endpoint, the Gemini 2.5 free tier) accessible through OpenRouter.

**Q: Will my API key work directly with provider APIs?**
A: No. OpenRouter API keys (`sk-or-v1-...`) only work against `openrouter.ai/api/v1`; provider-direct keys are separate. You can BYOK provider keys to OpenRouter to route through them at provider rates.

**Q: Does OpenRouter store my prompts?**
A: By default, yes (for analytics dashboards), but you can set `data_collection: deny` to opt out per request, or opt out globally in your OpenRouter account settings. Many providers also have their own opt-out, so set both.

---

## Source & Thanks

> Built by [OpenRouter](https://github.com/OpenRouterTeam). Commercial product with free credits.
>
> [openrouter.ai/docs](https://openrouter.ai/docs) — Official documentation
---

Source: https://tokrepo.com/en/workflows/openrouter-unified-api-for-300-llms-with-auto-failover
Author: OpenRouter