Main
Treat traces as a first-class artifact: per the README, the project stores JSONL traces under `traces/` and includes replay support. Use the eval harness early (`--n 50`) to guard against regressions when you add tools, prompts, or provider switches. Start with the CLI, then graduate to the HTTP API when you need multi-client access or UI integrations.
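As a rough illustration of the replay idea (this is a sketch, not the project's actual replay implementation; the `step` and `output` field names are assumptions for demonstration):

```python
import json
from pathlib import Path

def replay(trace_path: Path) -> list[str]:
    """Walk a JSONL trace (one record per line) and reconstruct the recorded steps."""
    steps = []
    for line in trace_path.read_text().splitlines():
        if not line.strip():
            continue
        record = json.loads(line)
        # Field names here are assumed for illustration, not the project's schema.
        steps.append(f'{record.get("step", "?")}: {record.get("output", "")}')
    return steps

# Synthetic trace for demonstration:
p = Path("replay_demo.jsonl")
p.write_text('{"step": "ask", "output": "42"}\n{"step": "tool", "output": "ok"}\n')
print(replay(p))  # ['ask: 42', 'tool: ok']
```

Because each line is an independent JSON record, a replay tool can stream a trace without loading the whole file, which is one reason JSONL is a common choice for agent traces.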
Source-backed notes
- README quickstart shows `micro-agent ask ... --utc`, a FastAPI server via `uvicorn`, and evals via `python evals/run_evals.py --n 50`.
- Docs describe provider config for OpenAI and Ollama, plus tracing under `traces/<id>.jsonl`.
- README lists eval metrics like success_rate, avg_latency_sec, and avg_cost_usd, and notes that usage/cost capture can be best-effort.
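A minimal sketch of how those metrics could be derived from a JSONL trace. The record field names below (`success`, `latency_sec`, `cost_usd`) are assumptions, not the project's documented schema; note how the best-effort cost capture is handled by tolerating missing fields:

```python
import json
from pathlib import Path

def summarize(trace_path: Path) -> dict:
    """Aggregate eval metrics from a JSONL trace (one record per line)."""
    records = [json.loads(line) for line in trace_path.read_text().splitlines() if line.strip()]
    n = len(records)
    # Usage/cost capture is best-effort, so skip records without a cost.
    costs = [r["cost_usd"] for r in records if r.get("cost_usd") is not None]
    return {
        "success_rate": sum(1 for r in records if r.get("success")) / n,
        "avg_latency_sec": sum(r.get("latency_sec", 0.0) for r in records) / n,
        "avg_cost_usd": (sum(costs) / len(costs)) if costs else None,
    }

# Example with a synthetic trace file:
p = Path("demo_trace.jsonl")
p.write_text(
    '{"success": true, "latency_sec": 1.2, "cost_usd": 0.001}\n'
    '{"success": false, "latency_sec": 0.8}\n'
)
print(summarize(p))  # success_rate 0.5, avg_latency_sec 1.0, avg_cost_usd 0.001
```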
FAQ
- Is this a full framework? No; the README frames it as a minimal runtime plus DSPy modules you can read end-to-end.
- Can I run it without OpenAI? Yes; the README includes an Ollama provider path configured via env vars.
- How do I keep changes safe? Use the built-in eval harness, and store/replay traces to compare behavior over time.
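The store-and-compare workflow can be sketched as a small gate between two eval runs. This assumes each run produces a dict of summary metrics; the field names and the 5% tolerance are illustrative, not part of the project:

```python
def regressed(baseline: dict, candidate: dict, tol: float = 0.05) -> list[str]:
    """Return metric names where the candidate run is worse than baseline by > tol.

    Assumes higher is better for success_rate and lower is better for
    latency/cost; metric names are illustrative.
    """
    problems = []
    if candidate["success_rate"] < baseline["success_rate"] - tol:
        problems.append("success_rate")
    for key in ("avg_latency_sec", "avg_cost_usd"):
        base, cand = baseline.get(key), candidate.get(key)
        # Cost capture may be best-effort, so only compare when both are present.
        if base is not None and cand is not None and cand > base * (1 + tol):
            problems.append(key)
    return problems

old = {"success_rate": 0.9, "avg_latency_sec": 1.0, "avg_cost_usd": 0.002}
new = {"success_rate": 0.8, "avg_latency_sec": 1.3, "avg_cost_usd": 0.002}
print(regressed(old, new))  # ['success_rate', 'avg_latency_sec']
```

Running a gate like this in CI against stored baseline metrics is one way to make "compare behavior over time" mechanical rather than manual.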