Workflows · April 8, 2026 · 1 min read

Bifrost CLI — Run Claude Code with Any AI Model

An enterprise AI gateway that lets Claude Code use any LLM provider. Bifrost routes requests to OpenAI, Gemini, Bedrock, Groq, and 20+ other providers with automatic failover.

What is Bifrost CLI?

Bifrost is an enterprise-grade AI gateway that unifies the APIs of 20+ providers, letting Claude Code switch models on demand. It supports automatic failover, semantic caching, and per-tier model overrides.

In one sentence: an AI gateway that lets Claude Code use any provider (OpenAI/Gemini/Bedrock/Groq, 20+ total), with automatic failover, semantic caching, and sub-100 μs overhead; 3.6k+ stars on GitHub.

For: Teams needing model flexibility and provider redundancy.

Core Features

1. Per-Tier Model Override

Use different providers for Opus/Sonnet/Haiku tiers.
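The idea reduces to a tier-to-provider lookup table. Below is a minimal sketch of that pattern; the mapping, provider names, and function are hypothetical illustrations, not Bifrost's actual configuration schema:

```python
# Hypothetical tier-to-provider mapping: each Claude Code model tier
# is routed to a different backend provider and model.
TIER_OVERRIDES = {
    "opus":   ("openai", "gpt-4o"),
    "sonnet": ("gemini", "gemini-1.5-pro"),
    "haiku":  ("groq",   "llama-3.1-8b-instant"),
}

def route(tier: str) -> tuple[str, str]:
    """Return (provider, model) for a tier, falling back to the first entry."""
    return TIER_OVERRIDES.get(tier, next(iter(TIER_OVERRIDES.values())))
```

With this shape, an expensive frontier model serves Opus-tier requests while cheap, fast models handle Haiku-tier background tasks.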

2. Automatic Failover

Switch to a backup provider when the primary fails.
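Failover amounts to trying providers in priority order until one succeeds. The sketch below shows the general pattern, not Bifrost's implementation:

```python
# Try each (name, callable) provider in priority order;
# fall through to the next one when a call raises.
def call_with_failover(providers, prompt):
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # in practice, catch provider-specific errors
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed: {errors}")
```

A real gateway would add timeouts, health checks, and retry budgets on top of this loop, but the control flow is the same.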

3. Real-Time Monitoring

Dashboard tracks requests, latency, and cost.
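Under the hood, such a dashboard aggregates per-request counters. A minimal roll-up of the three tracked dimensions might look like this (an illustrative sketch, not Bifrost's data model):

```python
from dataclasses import dataclass

@dataclass
class GatewayMetrics:
    """Running totals for requests, latency, and cost across the gateway."""
    requests: int = 0
    latency_ms: float = 0.0
    cost_usd: float = 0.0

    def record(self, latency_ms: float, cost_usd: float) -> None:
        """Fold one completed request into the totals."""
        self.requests += 1
        self.latency_ms += latency_ms
        self.cost_usd += cost_usd

    @property
    def avg_latency_ms(self) -> float:
        return self.latency_ms / self.requests if self.requests else 0.0
```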

FAQ

Q: How does it compare to LiteLLM? A: Bifrost claims to be 50x faster with sub-100μs overhead.


Sources & Acknowledgments

maximhq/bifrost — 3.6k+ stars, Apache 2.0

