Main
AutoContext is best understood as a harness layer rather than a standalone prompt. Its core job is to run evaluation loops over a scenario and feed each round's improvements back into the next agent run.
This matters for teams past the toy stage: you want repeatable improvement, not guesswork about why one particular run happened to succeed.
The README gives two concrete integration paths: a CLI path driven by Pi or provider environment variables, and an MCP path inside Claude Code. That suggests it targets real workflows, not just demos.
Source-backed notes
- README offers a 30-second path with uv tool install autocontext==0.5.0.
- It documents provider-based operation for Anthropic, OpenAI, Gemini, Mistral, Groq, OpenRouter, Azure, Claude CLI, Codex CLI, and MLX.
- Pi runtime and Claude Code MCP integration are both called out as supported paths.
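The CLI path above can be sketched as a short shell session. Only the uv install line and version pin come from the README; the environment variable names below are assumptions following each provider's usual convention, not documented AutoContext configuration.

```shell
# Documented 30-second install (version pin from the README)
uv tool install autocontext==0.5.0

# Provider-based operation: credentials are typically supplied via
# environment variables. These names follow common provider conventions
# and are assumptions, not AutoContext documentation.
export ANTHROPIC_API_KEY="<key>"   # Anthropic
export OPENAI_API_KEY="<key>"      # OpenAI
```

For the Claude Code MCP path, consult the README for the exact server registration step.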
FAQ
Q: Is AutoContext tied to one model provider? A: No. The README documents several providers plus Claude CLI, Codex CLI, and MLX paths.
Q: What is the fastest install path?
A: uv tool install autocontext==0.5.0, then point it at Pi or another supported provider.
Q: Why use it? A: It helps repeated agent runs improve systematically instead of starting from scratch every time.