# LLxprt Code — Multi-Provider AI Coding CLI

> LLxprt Code is an open-source AI coding CLI that switches across providers (Anthropic, Gemini, Codex, local). Install via brew or npm.

## Install

Install globally via npm (`npm install -g @vybestack/llxprt-code`) or via Homebrew as shown below.

## Quick Use

```bash
brew tap vybestack/homebrew-tap
brew update && brew install llxprt-code
llxprt
# then in REPL: /provider anthropic (or gemini/qwen/codex)
```

## Intro

LLxprt Code is an open-source AI coding CLI that switches across providers (Anthropic, Gemini, Codex, and local models). Install it via brew or npm, then drive it from an interactive REPL or script it non-interactively.

**Best for:** Terminal-first developers who want one CLI for multiple LLM providers
**Works with:** Node.js 20+ (npm path) or Homebrew; multiple provider auth flows
**Setup time:** 5-12 minutes

### Key facts (verified)

- GitHub: 673 stars · 89 forks · pushed 2026-05-13.
- License: Apache-2.0 · owner avatar and repo URL verified via the GitHub API.
- README-verified entrypoint: `npm install -g @vybestack/llxprt-code`.

## Main

- Pick your provider strategy first: free tiers (Gemini/Qwen), OAuth for subscriptions (Claude/Codex), or a local OpenAI-compatible base URL.
- Use non-interactive mode for automation (CI scripts) and the interactive REPL for exploratory coding and refactors.
- When you need tools, connect external MCP integrations (the README notes MCP support) and keep secrets in provider-specific configs.

### Source-backed notes

- The README shows provider commands such as `/auth`, `/provider`, and `/model` for Gemini/Qwen/Anthropic/Codex.
- The README covers install options via Homebrew and npm (`npm install -g @vybestack/llxprt-code`) plus a no-install npx example.
- The README describes both interactive REPL workflows and a non-interactive single-command mode for automation.

### FAQ

- **Do I have to use one provider?** No — the README is built around switching providers and models per session.
- **Is it only interactive?** No — the README shows a non-interactive mode for scripted usage and CI-style automation.
- **Can I use it with local models?** Yes — the README mentions local providers (LM Studio/llama.cpp/Ollama) and OpenAI-compatible base URLs.

## Source & Thanks

> Source: https://github.com/vybestack/llxprt-code
> License: Apache-2.0
> GitHub stars: 673 · forks: 89

---

Source: https://tokrepo.com/en/workflows/llxprt-code-multi-provider-ai-coding-cli
Author: Script Depot
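
## Appendix: CI sketch

The non-interactive mode the README describes lends itself to a small CI wrapper. The sketch below is an assumption-laden illustration, not the tool's documented interface: the `--prompt` flag name is hypothetical (the README only mentions a non-interactive single-command mode), so verify the real invocation with `llxprt --help` before using this.

```shell
#!/usr/bin/env sh
# Sketch: invoke LLxprt Code non-interactively in CI, skipping cleanly
# when the CLI is not installed on the runner.
run_llxprt() {
  if command -v llxprt >/dev/null 2>&1; then
    # NOTE: "--prompt" is an assumed flag name; check `llxprt --help`.
    llxprt --prompt "$1"
  else
    # Fall back to a hint instead of failing the whole pipeline.
    echo "llxprt not installed; run: npm install -g @vybestack/llxprt-code"
  fi
}

run_llxprt "list TODO comments in src/ and suggest fixes"
```

The guard via `command -v` keeps the step green on runners without the CLI; drop it if you want a hard failure instead.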