Workflows · May 13, 2026 · 2 min read

AutoContext — Self-Improving Agent Harness

AutoContext adds iterative improvement loops, provider integrations, MCP access, and CLI workflows so coding-agent results improve across repeated runs.

Agent ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, install contract, metadata JSON, adapter-aware plan, and raw content links so agents can judge fit, risk, and next actions.

  • Score: Native · 94/100 · Policy: allow
  • Agent surface: Any MCP/CLI agent
  • Kind: Workflow
  • Install: uv
  • Trust: Established
  • Entrypoint: uv tool install autocontext==0.5.0
  • Universal CLI install command: npx tokrepo install 7807e0fd-af51-57c0-9e1a-36a89fed3718
Intro


Best for: teams experimenting with recursive improvement loops and provider-agnostic agent evaluation instead of one-shot prompt runs

Works with: uv, Pi runtime, Claude Code MCP setups, multiple model providers, CLI and MCP-based agent workflows

Setup time: 8-15 minutes

Key facts (verified)

  • GitHub: 983 stars · 83 forks · pushed 2026-05-13.
  • License: Apache-2.0; owner avatar verified from GitHub API for greyhaven-ai.
  • Entry point checked from README: uv tool install autocontext==0.5.0.

Main

AutoContext is best read as a harness layer, not a single prompt package. Its job is to loop on scenarios, evaluate outputs, and feed improvements back into the next agent pass.

That matters when teams have moved past hobby usage and need repeatable improvement instead of ad hoc intuition about why a run succeeded.
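The loop-evaluate-feedback pattern described above can be sketched in a few lines. This is an illustrative sketch only: `run_agent` and `score_output` are hypothetical stand-ins for an agent call and an evaluator, not AutoContext's actual API.

```python
# Illustrative sketch of a self-improving harness loop.
# `run_agent` and `score_output` are hypothetical stand-ins, not AutoContext's API.
from typing import Callable


def improvement_loop(
    scenario: str,
    run_agent: Callable[[str, list[str]], str],
    score_output: Callable[[str], float],
    max_passes: int = 3,
    target: float = 0.9,
) -> tuple[str, float]:
    """Run the agent repeatedly, feeding prior critiques into each pass."""
    feedback: list[str] = []
    best_output, best_score = "", 0.0
    for _ in range(max_passes):
        output = run_agent(scenario, feedback)  # agent pass sees accumulated feedback
        score = score_output(output)            # evaluate this pass
        if score > best_score:
            best_output, best_score = output, score
        if score >= target:                     # good enough: stop early
            break
        feedback.append(f"Previous attempt scored {score:.2f}; improve it.")
    return best_output, best_score
```

The key design point is that each pass receives the critiques of earlier passes, so a failed run becomes input rather than wasted work.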

The README gives two practical integration modes: CLI-first with Pi or provider env vars, and MCP-first inside Claude Code. That flexibility is a strong sign the project was built for real usage, not just demos.

Source-backed notes

  • README offers a 30-second path with uv tool install autocontext==0.5.0.
  • It documents provider-based operation for Anthropic, OpenAI, Gemini, Mistral, Groq, OpenRouter, Azure, Claude CLI, Codex CLI, and MLX.
  • Pi runtime and Claude Code MCP integration are both called out as supported paths.

FAQ

Q: Is AutoContext tied to one model provider? A: No. The README documents several providers plus Claude CLI, Codex CLI, and MLX paths.

Q: What is the fastest install path? A: uv tool install autocontext==0.5.0, then point it at Pi or another supported provider.

Q: Why use it? A: It helps repeated agent runs improve systematically instead of starting from scratch every time.


Source & Thanks

Source: https://github.com/greyhaven-ai/autocontext · License: Apache-2.0 · GitHub stars: 983 · forks: 83

