# Open Interpreter — Local Code Interpreter CLI

> Open Interpreter runs a local code interpreter in your terminal. Install from GitHub, run `interpreter`, then write files and run commands with approvals.

## Quick Use

1. Install:
   ```bash
   pip install git+https://github.com/OpenInterpreter/open-interpreter.git
   ```
2. Run:
   ```bash
   interpreter
   ```
3. Verify:
   - Ask it to create a small file and confirm it requests approval before executing shell commands (per repo defaults)

---

## Intro

Open Interpreter runs a local code interpreter in your terminal. Install from GitHub, run `interpreter`, then write files and run commands with approvals.

- **Best for:** developers who want a local, unrestricted code-interpreter workflow (files + internet + packages) instead of a hosted sandbox
- **Works with:** Python, terminal workflows, OpenAI or other providers (per repo notes), and local model servers via an OpenAI-compatible API base
- **Setup time:** ~10 minutes

### Quantitative Notes

- CLI entrypoint: `interpreter` (per repo)
- Setup time: ~10 minutes
- GitHub stars (verified): see Source & Thanks

---

## Practical Notes

Use Open Interpreter for hands-on local tasks: data cleanup, one-off scripts, codebase refactors, and automation that needs filesystem access. Keep a tight loop: ask for a plan first, review commands before execution, and prefer small, reversible steps. If you share the machine, run it in a dedicated workspace directory and avoid granting wide permissions.

**Safety note:** Local execution is risky; require approvals for shell commands and keep the working directory narrow.

### FAQ

**Q: Is it the same as ChatGPT's Code Interpreter?**
A: It aims for a similar workflow but runs locally, with access to your machine and fewer platform restrictions (per the repo's comparison).

**Q: Can it run local models?**
A: Yes. The README mentions using an OpenAI-compatible server by passing an `api_base` URL.

**Q: How do I keep it safe?**
A: Use confirmation gates, run it in a sandboxed user account, and restrict the directories it can access.

---

## Source & Thanks

> GitHub: https://github.com/openinterpreter/open-interpreter
> Owner avatar: https://avatars.githubusercontent.com/u/163192481?v=4
> License (SPDX): AGPL-3.0
> GitHub stars (verified via `api.github.com/repos/OpenInterpreter/open-interpreter`): 63,470
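The safety practice described in Practical Notes (approval gates plus a narrow working directory) can be sketched in plain Python. This is illustrative only — `is_within` and `approve` are hypothetical helpers for your own wrapper scripts, not part of Open Interpreter's API:

```python
import os.path

def is_within(workdir: str, target: str) -> bool:
    """Return True if `target` resolves inside `workdir` (blocks `..` escapes)."""
    workdir = os.path.realpath(workdir)
    target = os.path.realpath(os.path.join(workdir, target))
    return os.path.commonpath([workdir, target]) == workdir

def approve(command: str, auto_allow: set) -> bool:
    """Gate a shell command: pass an allow-list, confirm everything else."""
    name = command.split()[0]
    if name in auto_allow:
        return True
    reply = input(f"Run `{command}`? [y/N] ")
    return reply.strip().lower() == "y"
```

The same two checks (path confinement and an explicit yes/no prompt) are what the tool's own approval flow gives you interactively; keeping a scripted version around is useful when you automate runs.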
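On the FAQ answer about local models: an "OpenAI-compatible server" just means one that serves the standard chat-completions endpoint, so pointing `api_base` at it is enough. A minimal sketch of the request shape such a server must accept — the `localhost:1234` address and `local-model` name are placeholder assumptions, not values from the repo:

```python
import json

# Assumed local server address (the kind of URL you would pass as `api_base`).
API_BASE = "http://localhost:1234/v1"

def chat_request(model: str, prompt: str):
    """Build the URL and JSON body of an OpenAI-compatible chat call."""
    url = f"{API_BASE}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body

url, body = chat_request("local-model", "Create hello.txt")
```

Any backend that answers this endpoint with the standard response schema (LM Studio, vLLM, and similar servers expose one) should work as a drop-in provider.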