Scripts · Mar 28, 2026 · 2 min read

Open Interpreter — Run Code via Natural Language

Run code locally through natural language. Supports Python, JavaScript, Shell and more with a ChatGPT-like interface on your own machine.

Introduction

Open Interpreter lets you control your computer through natural language. Ask it to create files, analyze data, edit images, or browse the web, and it writes and executes Python, JavaScript, or Shell commands locally. Unlike ChatGPT's Code Interpreter, there are no file size limits, no timeout restrictions, and it has full internet access. With 62,000+ GitHub stars, it is the most popular open-source local code execution agent.

Works with: Claude Code, GitHub Copilot, OpenAI Codex, Ollama

What You Can Do

  • Create and edit photos, videos, PDFs
  • Control a Chrome browser for research
  • Plot, clean, and analyze large datasets
  • Manage files and system settings
  • Run any Python/JS/Shell code through conversation

⚠️ Open Interpreter asks for your approval before executing each code block. You control what runs.
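If you want to skip these per-block confirmations in a trusted sandbox, the auto_run switch (referenced again in the Safety notes below) can be flipped from Python. This is a minimal sketch; verify the attribute against your installed version:

```python
# Disable per-block confirmation prompts.
# Only do this in a sandboxed environment (Codespaces, Replit, a VM):
from interpreter import interpreter

interpreter.auto_run = True
```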

Usage

Terminal

interpreter

Python

from interpreter import interpreter

interpreter.chat("Plot AAPL and META's normalized stock prices")

Programmatic (non-interactive)

interpreter.chat("Add subtitles to all videos in /videos.")
interpreter.chat("Make the subtitles bigger?")  # Continues context
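Because chat() returns the conversation as a list of message dicts and interpreter.messages is assignable (per the project's docs), you can persist a session between runs. The JSON helpers below are our own illustrative names, exercised here with a stand-in conversation so no LLM call is needed:

```python
import json

# Sketch: save a conversation to disk and restore it later, e.g. via
#   history = interpreter.chat("...")   # returns the message list
#   interpreter.messages = load_messages("session.json")  # resume

def save_messages(messages, path):
    """Write a conversation (list of message dicts) to disk as JSON."""
    with open(path, "w") as f:
        json.dump(messages, f)

def load_messages(path):
    """Read a previously saved conversation back into a list of dicts."""
    with open(path) as f:
        return json.load(f)

# Stand-in conversation to demonstrate the round trip:
history = [{"role": "user", "type": "message", "content": "Add subtitles."}]
save_messages(history, "session.json")
restored = load_messages("session.json")
print(restored == history)  # True
```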

Streaming

for chunk in interpreter.chat(message, display=False, stream=True):
    print(chunk)
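Each streamed chunk is a dict; the role/type/content schema assumed below is based on the project's streaming docs, so check it against your installed version. A small filter like this lets you pull out just the generated code, simulated here with a hand-built stream instead of a live LLM call:

```python
# Sketch: reduce a stream of Open Interpreter chunks to the code they carry.
# The {"role", "type", "content"} schema is an assumption from the
# streaming docs; verify it on your version.

def collect_code(chunks):
    """Concatenate the content of all code-type chunks in a stream."""
    return "".join(
        c.get("content", "")
        for c in chunks
        if c.get("type") == "code"
    )

# Simulated stream for illustration (no LLM call):
sample = [
    {"role": "assistant", "type": "message", "content": "Plotting now."},
    {"role": "assistant", "type": "code", "content": "import matplotlib"},
    {"role": "assistant", "type": "code", "content": ".pyplot as plt"},
]
print(collect_code(sample))  # import matplotlib.pyplot as plt
```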

Configuration

Change model

interpreter --model gpt-3.5-turbo
interpreter --model claude-2

Or, from Python:

interpreter.llm.model = "gpt-3.5-turbo"

Use local models

interpreter --local
# or with custom endpoint
interpreter --api_base "http://localhost:1234/v1" --api_key "fake_key"

Customize system message

interpreter.system_message += "Run shell commands with -y so user confirmation isn't needed."

Context window & tokens

interpreter --local --max_tokens 1000 --context_window 3000
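The same limits can be set from Python. The interpreter.llm.* attribute names below follow the project's docs but are worth verifying against your installed version:

```python
# Equivalent limits set programmatically (attribute names assumed from
# the project's docs; verify on your version):
from interpreter import interpreter

interpreter.llm.max_tokens = 1000
interpreter.llm.context_window = 3000
```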

Interactive commands

  • %verbose true/false — Toggle debug output
  • %reset — Clear conversation
  • %undo — Remove last exchange
  • %tokens [prompt] — Check token usage and cost
  • %help — Show help

FastAPI Server

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from interpreter import interpreter

app = FastAPI()

@app.get("/chat")
def chat_endpoint(message: str):
    def event_stream():
        for result in interpreter.chat(message, stream=True):
            yield f"data: {result}\n\n"
    return StreamingResponse(event_stream(), media_type="text/event-stream")
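On the client side, each event arrives as a data: line followed by a blank line, matching the event_stream() framing above. A minimal parser (parse_sse is our own illustrative name) looks like this:

```python
# Sketch: extract payloads from the endpoint's SSE-style stream.
# The "data: ..." framing mirrors event_stream() above; parse_sse is ours.

def parse_sse(text):
    """Return the payload of every 'data: ' line in an SSE body."""
    prefix = "data: "
    return [
        line[len(prefix):]
        for line in text.splitlines()
        if line.startswith(prefix)
    ]

# Example body as the endpoint would emit it:
body = "data: Hello\n\ndata: World\n\n"
print(parse_sse(body))  # ['Hello', 'World']
```

For real deployments, an SSE client library that handles reconnects and multi-line events is preferable to hand parsing.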

Safety

  • Prompts you to approve each code block before execution (unless auto_run=True)
  • Be cautious with file-modifying commands
  • Consider running in sandboxed environments (Codespaces, Replit)
  • Safe mode documentation available

FAQ

Q: What is Open Interpreter? A: Run code locally through natural language. Supports Python, JavaScript, Shell and more with a ChatGPT-like interface on your own machine.

Q: How do I install Open Interpreter? A: Install it with pip install open-interpreter, then launch it by running interpreter in your terminal. Setup typically takes under two minutes.


Source and acknowledgments

Created by Killian Lucas / OpenInterpreter. Licensed under AGPL-3.0. open-interpreter — ⭐ 62,800+

Thanks to Killian Lucas for creating the leading open-source local code execution interface. Not affiliated with OpenAI.

