Workflows · May 13, 2026 · 2 min read

AutoContext — Self-Improving Agent Harness

AutoContext adds iterative improvement loops, provider integrations, MCP access, and CLI workflows so coding-agent results improve across repeated runs.

Agent-ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, an install contract, JSON metadata, an adapter-specific plan, and raw content so agents can evaluate compatibility, risk, and next steps.

Native · 94/100 · Policy: allow

  • Agent surface: Any MCP/CLI agent
  • Type: Workflow
  • Install: uv
  • Trust: Established
  • Entry point: uv tool install autocontext==0.5.0
  • Universal CLI command: npx tokrepo install 7807e0fd-af51-57c0-9e1a-36a89fed3718
Introduction

Best for: teams experimenting with recursive improvement loops and provider-agnostic agent evaluation instead of one-shot prompt runs

Works with: uv, Pi runtime, Claude Code MCP setups, multiple model providers, CLI and MCP-based agent workflows

Setup time: 8-15 minutes

Key facts (verified)

  • GitHub: 983 stars · 83 forks · pushed 2026-05-13.
  • License: Apache-2.0; owner avatar verified from GitHub API for greyhaven-ai.
  • Entry point checked from README: uv tool install autocontext==0.5.0.

Main

AutoContext is best read as a harness layer, not a single prompt package. Its job is to loop on scenarios, evaluate outputs, and feed improvements back into the next agent pass.

That matters when teams have moved past hobby usage and need repeatable improvement instead of ad hoc intuition about why a run succeeded.
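The harness pattern described above can be sketched as a simple loop: run the agent, score the result, and fold the evaluation back into the next pass's context. This is a conceptual illustration only; every function and name below is hypothetical and does not reflect AutoContext's actual API.

```python
# Conceptual sketch of a self-improving harness loop.
# All names (run_agent, evaluate, refine_context) are hypothetical
# stand-ins, not AutoContext's real interface.

def run_agent(context: str, scenario: str) -> str:
    """Stand-in for a coding-agent call; returns a mock output string."""
    return f"output for {scenario} given {len(context)} chars of context"

def evaluate(output: str) -> float:
    """Stand-in evaluator: maps an output to a score in [0, 1]."""
    return min(1.0, len(output) / 100)

def refine_context(context: str, score: float) -> str:
    """Feed the evaluated result back into the next pass's context."""
    return context + f"\n[prior run scored {score:.2f}]"

def improvement_loop(scenario: str, passes: int = 3) -> list[float]:
    context, scores = "initial instructions", []
    for _ in range(passes):
        output = run_agent(context, scenario)
        score = evaluate(output)
        scores.append(score)
        context = refine_context(context, score)  # next pass sees history
    return scores

scores = improvement_loop("fix failing test")
print(scores)
```

The point of the loop is that each pass starts from the accumulated evaluation history rather than from scratch, which is the "repeatable improvement" the harness is built around.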

The README gives two practical integration modes: CLI-first with Pi or provider env vars, and MCP-first inside Claude Code. That flexibility is a strong sign the project was built for real usage, not just demos.

Source-backed notes

  • README offers a 30-second path with uv tool install autocontext==0.5.0.
  • It documents provider-based operation for Anthropic, OpenAI, Gemini, Mistral, Groq, OpenRouter, Azure, Claude CLI, Codex CLI, and MLX.
  • Pi runtime and Claude Code MCP integration are both called out as supported paths.

FAQ

Q: Is AutoContext tied to one model provider? A: No. The README documents several providers plus Claude CLI, Codex CLI, and MLX paths.

Q: What is the fastest install path? A: uv tool install autocontext==0.5.0, then point it at Pi or another supported provider.

Q: Why use it? A: It helps repeated agent runs improve systematically instead of starting from scratch every time.


Source and acknowledgments

Source: https://github.com/greyhaven-ai/autocontext · License: Apache-2.0 · GitHub stars: 983 · forks: 83
