Practical Notes
- GitHub: 865 stars · 69 forks; pushed 2026-05-12 (verified via GitHub API).
- README shows `docker run -p 8080:8080` and a first-call example to `/v1/chat/completions`.
- README lists multiple providers (OpenAI, Anthropic, Gemini, DeepSeek, Groq, OpenRouter, Ollama, vLLM, Bedrock, etc.).
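Put together, the README's quickstart amounts to something like the sketch below. The image name (`gateway-image`) and the model identifier are placeholders, not the project's actual names; substitute the ones from the README:

```shell
# Hypothetical image name; use the one from the project's README.
docker run -p 8080:8080 --env-file .env gateway-image:latest

# First call: the gateway exposes an OpenAI-compatible endpoint,
# so any OpenAI-style client or plain curl works against it.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "some-provider/some-model",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

Because the endpoint shape matches OpenAI's, existing SDKs can usually be pointed at the gateway by overriding their base URL.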
Main
Operational checklist for gateways:
- Prefer env files in production. The README warns against passing secrets via `-e` on the command line.
- Log request IDs. Keep a request-id header in your clients so you can trace failures end-to-end.
- Start with one provider + one model. Add routing only after you have stable monitoring and quotas.
- Pin model allowlists: expose only the models you approve (per the README's provider-model configuration patterns).
Treat the gateway as infrastructure: version it, monitor it, and keep credentials scoped and rotated.
FAQ
Q: Is it only for OpenAI? A: No. The README says it provides an OpenAI-compatible API across many providers, depending on which credentials you set.
Q: Do I need to pass all keys? A: No. The README says it detects available providers from the credentials you supply; at least one is required.
Q: How do I avoid leaking secrets?
A: Use `--env-file` instead of `-e`, and keep `.env` out of git history.
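As a concrete sketch (the image name `gateway-image` and the variable names are illustrative; use whatever keys your providers require):

```shell
# .env — never committed; one KEY=value per line, e.g.:
#   OPENAI_API_KEY=sk-...
#   ANTHROPIC_API_KEY=...

# Risky: -e leaves the secret in shell history.
# docker run -e OPENAI_API_KEY=sk-... -p 8080:8080 gateway-image

# Safer: the secret stays in a file with restricted permissions.
chmod 600 .env
docker run --env-file .env -p 8080:8080 gateway-image

# And keep the file out of version control:
echo ".env" >> .gitignore
```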