CLI Tools · May 14, 2026 · 2 min read

Orla — Execution Engine for Agentic Workflows

Orla is a runtime/library for multi-stage agentic workflows and routing; the repository is verified at 244★ and ships a daemon plus a Python SDK (`pyorla`).

Agent-ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, an install contract, JSON metadata, an adapter-specific plan, and the raw content to help agents assess fit, risk, and next actions.

Native · 94/100 · Policy: allow
Agent surface: Any MCP/CLI agent
Type: CLI
Installation: Brew
Trust: Established
Entry point: brew install --cask harvard-cns/orla/orla
Universal CLI command: npx tokrepo install 1a7eee44-32ea-5323-b7a1-e8d69df26610
Introduction

Best for: Researchers and platform engineers who want a structured runtime for multi-stage LLM workflows and scheduling

Works with: Orla daemon (Homebrew cask) + optional Python SDK; see project website for latest docs

Setup time: 10-25 minutes

Key facts (verified)

  • GitHub: 244 stars · 7 forks · last pushed 2026-05-13.
  • License: MIT · owner avatar + repo URL verified via GitHub API.
  • README-backed entrypoint: brew install --cask harvard-cns/orla/orla.

Main

  • Keep stages explicit so you can route models/backends without rewriting application logic.

  • Start with one workflow and measure latency per stage; only then scale out and add caching/memory policies.

  • Design for observability early: trace stage boundaries and store replayable inputs/outputs for debugging.
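The first two points can be sketched outside any particular framework. The `Stage` and `Workflow` names below are hypothetical illustrations of the pattern (explicit stages, swappable backends, per-stage latency), not Orla's actual API:

```python
import time
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class Stage:
    """One explicit workflow stage; `backend` names the model/route it uses."""
    name: str
    backend: str
    run: Callable[[Any], Any]

@dataclass
class Workflow:
    stages: list
    timings: dict = field(default_factory=dict)

    def execute(self, payload: Any) -> Any:
        # Run stages in order, recording per-stage latency so you can see
        # where caching or a cheaper backend would actually pay off.
        for stage in self.stages:
            start = time.perf_counter()
            payload = stage.run(payload)
            self.timings[stage.name] = time.perf_counter() - start
        return payload

# Routing a stage to a different model means replacing one Stage object,
# not rewriting the application logic around it.
wf = Workflow(stages=[
    Stage("draft", backend="small-model", run=lambda p: p + " -> draft"),
    Stage("refine", backend="large-model", run=lambda p: p + " -> refined"),
])
print(wf.execute("query"))   # query -> draft -> refined
```

Storing each stage's inputs and outputs alongside `timings` gives you the replayable trace the third point recommends.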

Source-backed notes

  • README installs the daemon via Homebrew cask and the client SDK via pip install pyorla.
  • README describes components: stage mapper, workflow orchestrator, and a memory manager for shared inference state.
  • README includes an academic citation to arXiv:2603.13605.
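Taken together, the README's install path comes down to two commands (both quoted from the notes above; this assumes Homebrew and pip are already available on your machine):

```shell
# Install the Orla daemon via the Homebrew cask named in the README
brew install --cask harvard-cns/orla/orla

# Optionally install the Python client SDK
pip install pyorla
```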

FAQ

  • Is this a prompt library?: No — it’s a runtime/engine; you still define your tasks and policies.
  • Do I need the Python SDK?: Only if your clients are Python; the daemon can be used independently per docs.
  • Is it production-ready?: Validate on your workloads first; treat it as an evolving, research-backed system.

Source and acknowledgements

Source: https://github.com/harvard-cns/orla · License: MIT · GitHub stars: 244 · forks: 7
