CLI Tools · May 14, 2026 · 2 min read

Orla — Execution Engine for Agentic Workflows

Orla is a runtime/library for multi-stage agentic workflows and routing. The repository is verified at 244★ and ships a daemon plus a Python SDK (`pyorla`).

Agent ready

This asset can be read and installed directly by agents. TokRepo exposes a universal CLI install command, an install contract, metadata JSON, an adapter-aware plan, and raw content links so agents can judge fit, risk, and next actions.

Native · 94/100 · Policy: allow

  • Agent surface: Any MCP/CLI agent
  • Kind: CLI
  • Install: Brew
  • Trust: Established
  • Entrypoint: brew install --cask harvard-cns/orla/orla
  • Universal CLI install command: npx tokrepo install 1a7eee44-32ea-5323-b7a1-e8d69df26610
Intro

Best for: Researchers and platform engineers who want a structured runtime for multi-stage LLM workflows and scheduling

Works with: Orla daemon (Homebrew cask) + optional Python SDK; see project website for latest docs

Setup time: 10-25 minutes

Key facts (verified)

  • GitHub: 244 stars · 7 forks · pushed 2026-05-13.
  • License: MIT · owner avatar + repo URL verified via GitHub API.
  • README-backed entrypoint: brew install --cask harvard-cns/orla/orla.

Main

  • Keep stages explicit so you can route models/backends without rewriting application logic.

  • Start with one workflow and measure latency per stage; only then scale out and add caching/memory policies.

  • Design for observability early: trace stage boundaries and store replayable inputs/outputs for debugging.
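To make the guidance above concrete, here is a minimal sketch that assumes nothing about Orla's actual API: explicit named stages, per-stage latency measurement, and a replayable trace of each stage's inputs and outputs. All class names and fields here are hypothetical illustrations, not Orla's implementation.

```python
import json
import time
from dataclasses import dataclass, field
from typing import Any, Callable

# Hypothetical sketch (not the Orla API): explicit stages, timed
# individually, with a replayable trace stored at every boundary.

@dataclass
class Stage:
    name: str
    fn: Callable[[Any], Any]

@dataclass
class Workflow:
    stages: list
    trace: list = field(default_factory=list)

    def run(self, payload):
        for stage in self.stages:
            start = time.perf_counter()
            result = stage.fn(payload)
            elapsed_ms = (time.perf_counter() - start) * 1000
            # Record a replayable entry: stage name, serialized
            # input/output, and measured latency.
            self.trace.append({
                "stage": stage.name,
                "input": json.dumps(payload, default=str),
                "output": json.dumps(result, default=str),
                "latency_ms": round(elapsed_ms, 3),
            })
            payload = result
        return payload

# One small workflow first; inspect per-stage latency before scaling out.
wf = Workflow(stages=[
    Stage("normalize", lambda text: text.strip().lower()),
    Stage("route", lambda text: {"model": "small" if len(text) < 20 else "large",
                                 "text": text}),
])
out = wf.run("  Hello, Orla  ")
```

Because routing lives in its own stage, swapping the model/backend choice means editing one lambda, not the surrounding application logic.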

Source-backed notes

  • README installs the daemon via Homebrew cask and the client SDK via pip install pyorla.
  • README describes components: stage mapper, workflow orchestrator, and a memory manager for shared inference state.
  • README includes an academic citation to arXiv:2603.13605.
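The component names above come from the README, but as a rough mental model only, here is a hedged sketch of how a stage mapper, workflow orchestrator, and shared memory manager might fit together. The classes and methods are assumptions for illustration, not Orla's implementation.

```python
# Illustrative sketch of the components the README names; every API
# here is an assumption, not Orla's actual interface.

class MemoryManager:
    """Shared inference state visible to every stage."""
    def __init__(self):
        self._state = {}
    def put(self, key, value):
        self._state[key] = value
    def get(self, key, default=None):
        return self._state.get(key, default)

class StageMapper:
    """Maps a stage name to the backend that should serve it."""
    def __init__(self, mapping):
        self._mapping = mapping
    def backend_for(self, stage):
        return self._mapping[stage]

class Orchestrator:
    """Runs stages in order, passing shared memory between them."""
    def __init__(self, mapper, memory):
        self.mapper = mapper
        self.memory = memory
    def run(self, stages, prompt):
        result = prompt
        for stage_name, fn in stages:
            backend = self.mapper.backend_for(stage_name)
            result = fn(result, backend, self.memory)
        return result

mapper = StageMapper({"draft": "small-model", "refine": "large-model"})
memory = MemoryManager()
orch = Orchestrator(mapper, memory)
stages = [
    ("draft", lambda text, backend, mem: f"[{backend}] {text}"),
    ("refine", lambda text, backend, mem: f"[{backend}] {text}"),
]
out = orch.run(stages, "hello")
```

The point of the separation: the mapper owns backend choice, the memory manager owns shared state, and the orchestrator only sequences stages, so each concern can change independently.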

FAQ

  • Is this a prompt library?: No — it’s a runtime/engine; you still define your tasks and policies.
  • Do I need the Python SDK?: Only if your clients are Python; the daemon can be used independently per docs.
  • Is it production-ready?: Validate on your workloads first; treat it as an evolving, research-backed system.

Source & Thanks

Source: https://github.com/harvard-cns/orla · License: MIT · GitHub stars: 244 · forks: 7
