# dario — Local LLM Router for Claude Subscriptions

> Local LLM router for agent tools: exposes Anthropic- and OpenAI-compatible APIs at http://localhost:3456; verified 197 stars and zero runtime dependencies.

## Install

```bash
npm install -g @askalf/dario
```

## Quick Use

```bash
dario login
dario proxy
export ANTHROPIC_BASE_URL=http://localhost:3456
export ANTHROPIC_API_KEY=dario
```

## Intro

**Best for:** Claude Pro/Max subscribers who want Cursor/Aider/Cline/Codex CLI to share one local endpoint instead of paying per-token API rates everywhere.

**Works with:** any client that honors `ANTHROPIC_BASE_URL` or `OPENAI_BASE_URL`, local `localhost:3456` routing, and optional Docker deployments.

**Setup time:** 5-10 minutes

## Main

- The core promise is concrete: one local endpoint (`http://localhost:3456`) that speaks both the Anthropic Messages API and the OpenAI Chat Completions API request shapes.
- The README documents explicit subscription tiers (Pro $20 / Max 5x $100 / Max 20x $200) and positions the proxy as a way for heavy tool users to avoid per-token spend.
- The repo highlights auditability and safety signals: zero runtime dependencies, SLSA-attested releases, and local credential storage under `~/.dario/`.

### FAQ

- **Is it a hosted service?** No. The README frames dario as a local, single-user router you run on your own machine.
- **Does it require changing tools?** Usually not: point tools at `ANTHROPIC_BASE_URL` / `OPENAI_BASE_URL` and keep the same workflows.
- **What should I verify first?** Start `dario proxy`, then run one short request from a tool you already use and confirm it routes via `localhost:3456`.
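The redirection behind the `ANTHROPIC_BASE_URL` trick above can be sketched in plain Python (stdlib only). This is not dario's code: the URL path, `x-api-key` header, and request body follow the standard Anthropic Messages wire format that the README says the proxy accepts, and the model name is purely illustrative.

```python
import os

# Tools that honor ANTHROPIC_BASE_URL send their Messages API requests
# to whatever base it names -- here the local dario proxy, not
# api.anthropic.com. The fallback default below is an assumption for
# this sketch.
base = os.environ.get("ANTHROPIC_BASE_URL", "http://localhost:3456")
url = f"{base}/v1/messages"

# Standard Anthropic Messages request shape; "claude-example" is a
# placeholder model name, not something dario defines.
payload = {
    "model": "claude-example",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "ping"}],
}

headers = {
    "content-type": "application/json",
    "x-api-key": os.environ.get("ANTHROPIC_API_KEY", "dario"),
}

print(url)                     # http://localhost:3456/v1/messages
print(sorted(payload.keys()))  # ['max_tokens', 'messages', 'model']
```

Because the proxy only changes where the request is sent, not its shape, an unmodified client works as long as it builds the same payload and reads the base URL from the environment.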
## Source & Thanks

> Source: https://github.com/askalf/dario
> License: MIT
> GitHub stars: 197 · forks: 38

---

Source: https://tokrepo.com/en/workflows/dario-local-llm-router-for-claude-subscriptions
Author: Script Depot