CLI Tools · May 14, 2026 · 2 min read

Orla — Execution Engine for Agentic Workflows

Orla is a runtime/library for multi-stage agentic workflows and routing. The repository has a verified 244 stars and ships a daemon plus a Python SDK (`pyorla`).

Agent-ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, an install contract, JSON metadata, an adapter-specific plan, and the raw content so that agents can evaluate compatibility, risk, and next steps.

Native · 94/100 · Policy: allow
Agent surface: Any MCP/CLI agent
Type: CLI
Install: Brew
Trust: Established
Entrypoint: `brew install --cask harvard-cns/orla/orla`
Universal CLI command: `npx tokrepo install 1a7eee44-32ea-5323-b7a1-e8d69df26610`
Introduction


Best for: Researchers and platform engineers who want a structured runtime for multi-stage LLM workflows and scheduling

Works with: Orla daemon (Homebrew cask) + optional Python SDK; see project website for latest docs

Setup time: 10-25 minutes

Key facts (verified)

  • GitHub: 244 stars · 7 forks · pushed 2026-05-13.
  • License: MIT · owner avatar + repo URL verified via GitHub API.
  • README-backed entrypoint: `brew install --cask harvard-cns/orla/orla`.

Main takeaways

  • Keep stages explicit so you can route models/backends without rewriting application logic.

  • Start with one workflow and measure latency per stage; only then scale out and add caching/memory policies.

  • Design for observability early: trace stage boundaries and store replayable inputs/outputs for debugging.
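The tips above can be sketched in plain Python. This is a hypothetical illustration of explicit stages with latency measured at stage boundaries; none of the names below come from Orla's actual API.

```python
import time

# Hypothetical stage functions; in a real workflow these would call
# different models or backends.
def retrieve(query):
    return f"docs for {query}"

def generate(context):
    return f"answer based on {context}"

def run_workflow(query, stages):
    """Run named stages in order and record per-stage latency."""
    timings, data = [], query
    for name, fn in stages:
        start = time.perf_counter()
        data = fn(data)
        timings.append((name, time.perf_counter() - start))
    return data, timings

result, timings = run_workflow(
    "orla", [("retrieve", retrieve), ("generate", generate)]
)
print(result)  # → answer based on docs for orla
for name, secs in timings:
    print(f"{name}: {secs * 1000:.3f} ms")
```

Because each stage is a named, swappable function, you can route a single stage to a different backend without rewriting the rest of the workflow, and the recorded timings tell you which stage to optimize first.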

Source-backed notes

  • README installs the daemon via Homebrew cask and the client SDK via `pip install pyorla`.
  • README describes components: stage mapper, workflow orchestrator, and a memory manager for shared inference state.
  • README includes an academic citation to arXiv:2603.13605.
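To make the three README components concrete, here is a toy sketch (again, not Orla's real API) of how a stage mapper, a workflow orchestrator, and a memory manager for shared inference state might fit together:

```python
class MemoryManager:
    """Toy shared-state store passed between stages."""
    def __init__(self):
        self.state = {}
    def put(self, key, value):
        self.state[key] = value
    def get(self, key):
        return self.state.get(key)

def stage_mapper(stage_name):
    # Assign each stage to a (hypothetical) backend label.
    return {"draft": "small-model", "refine": "large-model"}.get(stage_name, "default")

def orchestrate(stages, memory):
    """Run stages in order; each stage reads shared memory and a mapped backend."""
    for name, fn in stages:
        backend = stage_mapper(name)
        memory.put(name, fn(memory, backend))
    return memory

mem = orchestrate(
    [("draft", lambda m, b: f"draft via {b}"),
     ("refine", lambda m, b: f"refined {m.get('draft')} via {b}")],
    MemoryManager(),
)
print(mem.get("refine"))  # → refined draft via small-model via large-model
```

The separation mirrors the README's description: routing decisions live in the mapper, sequencing in the orchestrator, and cross-stage state in the memory manager.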

FAQ

  • Is this a prompt library?: No — it’s a runtime/engine; you still define your tasks and policies.
  • Do I need the Python SDK?: Only if your clients are Python; the daemon can be used independently per docs.
  • Is it production-ready?: Validate on your workloads first; treat it as an evolving, research-backed system.

Source and acknowledgements

Source: https://github.com/harvard-cns/orla · License: MIT · GitHub stars: 244 · forks: 7

