# Open Interpreter — Local Code Interpreter CLI

> Open Interpreter runs a local code interpreter in your terminal. Install from GitHub, run `interpreter`, then write files and run commands with approvals.

## Quick Use

1. Install:

   ```bash
   pip install git+https://github.com/OpenInterpreter/open-interpreter.git
   ```

2. Run:

   ```bash
   interpreter
   ```

3. Verify: ask it to create a small file and confirm it requests approval before executing shell commands (per repo defaults).

---

## Intro

Open Interpreter runs a local code interpreter in your terminal. Install it from GitHub, run `interpreter`, then write files and run commands with approvals.

- **Best for:** developers who want a local, unrestricted code-interpreter workflow (files + internet + packages) instead of a hosted sandbox
- **Works with:** Python, terminal workflows, OpenAI or other providers (per repo notes), and local model servers via an OpenAI-compatible API base
- **Setup time:** ~10 minutes

### Quantitative Notes

- CLI entrypoint: `interpreter` (per repo)
- Setup time: ~10 minutes
- GitHub stars (verified): see Source & Thanks

---

## Practical Notes

Use Open Interpreter for hands-on local tasks: data cleanup, one-off scripts, codebase refactors, and automation that needs filesystem access. Keep a tight loop: ask for a plan, review commands before execution, and prefer small, reversible steps. If you share the machine, run it in a dedicated workspace directory and avoid granting wide permissions.

**Safety note:** Local execution is risky; require approvals for shell commands and keep a narrow working directory.
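The review-before-execution loop above can be sketched in plain Python. This is a minimal, hypothetical approval gate for illustration only, not part of Open Interpreter's API; the helper name and allowlist are assumptions:

```python
import subprocess

def run_with_approval(command, approve):
    """Run a shell command only if the approve callback returns True.

    `command` is a list of arguments (no shell=True, to limit injection risk).
    `approve` is any callable taking the command and returning True/False;
    in an interactive session it could wrap input("Run this? [y/N] ").
    """
    if not approve(command):
        return None  # rejected: nothing is executed
    return subprocess.run(command, capture_output=True, text=True)

# Example policy: auto-reject anything not on a small read-only allowlist.
SAFE_COMMANDS = {"ls", "cat", "head"}
result = run_with_approval(["echo", "hi"], lambda cmd: cmd[0] in SAFE_COMMANDS)
print(result)  # None — "echo" is not on the allowlist, so nothing ran
```

The same idea scales to Open Interpreter's own confirmation prompts: the agent proposes a command, a human (or a narrow policy) approves it, and only then does anything touch the filesystem.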
### FAQ

**Q: Is it the same as ChatGPT's Code Interpreter?**
A: It aims for a similar workflow but runs locally, with access to your machine and fewer platform restrictions (per the repo's comparison).

**Q: Can it run local models?**
A: Yes. The README mentions connecting to an OpenAI-compatible server by passing an `api_base` URL.

**Q: How do I keep it safe?**
A: Use confirmation gates, run it under a sandboxed user account or container, and restrict the directories it can access.

---

## Source & Thanks

> GitHub: https://github.com/openinterpreter/open-interpreter
> Owner avatar: https://avatars.githubusercontent.com/u/163192481?v=4
> License (SPDX): AGPL-3.0
> GitHub stars (verified via `api.github.com/repos/OpenInterpreter/open-interpreter`): 63,470

---

Source: https://tokrepo.com/en/workflows/open-interpreter-local-code-interpreter-cli
Author: Script Depot