Configs · April 6, 2026 · 1 min read

Cloudflare AI Gateway — LLM Proxy, Cache & Analytics

Free proxy gateway for LLM API calls with caching, rate limiting, cost tracking, and fallback routing across providers. Reduce costs by up to 95% with response caching. 7,000+ stars.

Intro

Cloudflare AI Gateway is a free LLM API proxy with 7,000+ GitHub stars. It provides caching, rate limiting, cost analytics, and failover routing. Caching responses can cut LLM costs by up to 95%. Ideal for teams running LLM apps in production who need cost control.


Quick Use

  1. Create an AI Gateway in the Cloudflare console
  2. Swap in the gateway API base URL:

from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway}/openai"
)


Source & Thanks

Created by Cloudflare. Licensed under Apache 2.0.

ai-gateway — ⭐ 7,000+

