## Quick Use
| Need | Best pick |
|---|---|
| Self-hosted, full control | LiteLLM |
| Fastest start, most models | OpenRouter |
| Caching + cost reduction | Cloudflare AI Gateway |
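All three picks above expose an OpenAI-compatible HTTP endpoint, so getting started looks the same regardless of which you choose. As a minimal sketch, here is how a chat-completion request to OpenRouter (the "fastest start" pick) is assembled with only the standard library; the model id and API key are placeholders, and the request is built but not sent, since sending requires a real key.

```python
import json
import urllib.request

# Placeholder key: replace with a real OpenRouter API key before sending.
API_KEY = "sk-or-REPLACE_ME"

# Any model id from OpenRouter's catalog works here (assumed example id).
payload = {
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Build the request against OpenRouter's OpenAI-compatible endpoint.
req = urllib.request.Request(
    "https://openrouter.ai/api/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.full_url)  # one endpoint fronting many upstream providers
# urllib.request.urlopen(req) would send it once API_KEY is real.
```

The same request shape works against LiteLLM or Cloudflare AI Gateway by swapping the URL, which is what makes mixing gateways practical.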
## Intro
Every team running LLM apps faces the same question: which gateway should sit between the application and the model providers? This guide compares the three main options: LiteLLM (a self-hosted proxy), OpenRouter (a hosted unified API), and Cloudflare AI Gateway (an edge layer with caching). It covers architecture, pricing, features, and best-fit scenarios. The options are not mutually exclusive; many production setups use two or three simultaneously.
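Because all three speak the OpenAI-compatible API, combining or switching gateways is largely a base-URL change in client code. The sketch below illustrates that idea; the URLs are assumptions (LiteLLM's proxy defaults to local port 4000, and the Cloudflare path segments are placeholders for your own account and gateway names), not a definitive configuration.

```python
# Illustrative base URLs; the Cloudflare entry keeps literal placeholder
# segments ({account_id}, {gateway}) that you would fill in yourself.
GATEWAYS = {
    "litellm": "http://localhost:4000/v1",
    "openrouter": "https://openrouter.ai/api/v1",
    "cloudflare": "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway}/openai",
}


def chat_url(gateway: str) -> str:
    """Return the chat-completions URL for the chosen gateway."""
    return GATEWAYS[gateway].rstrip("/") + "/chat/completions"


print(chat_url("openrouter"))
# https://openrouter.ai/api/v1/chat/completions
```

Centralizing the base URL like this is one common way teams run, say, LiteLLM in front of OpenRouter: only the routing table changes, not the calling code.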
## Source & Thanks
Based on official documentation and community benchmarks; last updated April 2026.