Scripts · Mar 31, 2026 · 1 min read

OpenAI Swarm — Lightweight Multi-Agent Orchestration

Educational multi-agent framework by OpenAI. Ergonomic agent handoffs, tool calling, and context variables. Minimal abstraction over Chat Completions API. 21K+ stars.

TL;DR
OpenAI Swarm provides ergonomic agent handoffs and tool calling as a thin layer over Chat Completions.
§01

What it is

OpenAI Swarm is an educational multi-agent framework that demonstrates lightweight patterns for orchestrating multiple AI agents. It provides ergonomic agent handoffs, tool calling, and context variables with minimal abstraction over the Chat Completions API. The framework is intentionally simple: agents are functions, handoffs are return values, and context is a shared dictionary.
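That core idea can be shown in a toy sketch (this is an illustration of the pattern, not Swarm's actual internals): agents are callables, a handoff is just a return value, and context is a shared dict.

```python
# Toy illustration of the Swarm pattern (not Swarm's internals):
# agents as functions, handoffs as return values, context as a shared dict.

def triage(message, context):
    # Returning another agent function signals a handoff.
    if "locked" in message:
        return support
    return sales

def sales(message, context):
    context["department"] = "sales"
    return f"Sales here, re: {message}"

def support(message, context):
    context["department"] = "support"
    return f"Support here, re: {message}"

def run(agent, message, context):
    # Loop until an agent returns a final string instead of another agent.
    result = agent(message, context)
    while callable(result):
        result = result(message, context)
    return result

context = {}
reply = run(triage, "My account is locked", context)
print(reply)    # Support here, re: My account is locked
print(context)  # {'department': 'support'}
```

The real framework adds LLM-driven routing on top of this loop, but the control flow is the same shape.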

This tool is for developers learning multi-agent patterns and teams building proof-of-concept agent systems. OpenAI positions it as educational rather than production-grade.

§02

How it saves time or tokens

Swarm strips multi-agent orchestration to its essentials. Instead of complex frameworks with dozens of concepts, Swarm has three: agents, handoffs, and context variables. This makes it fast to prototype and easy to understand. The minimal abstraction means you control exactly what goes to the API, avoiding hidden token costs from framework overhead.

§03

How to use

  1. Install Swarm from the OpenAI GitHub repository.
  2. Define agents as objects with instructions and functions.
  3. Define handoff functions that return other agents.
  4. Run the swarm client to orchestrate.
# Install Swarm
pip install git+https://github.com/openai/swarm.git
§04

Example

from swarm import Swarm, Agent

client = Swarm()

def transfer_to_sales():
    # Returning an Agent from a function triggers a handoff to that agent.
    return sales_agent

def transfer_to_support():
    return support_agent

triage_agent = Agent(
    name='Triage',
    instructions='Route the user to the right department.',
    functions=[transfer_to_sales, transfer_to_support]
)

sales_agent = Agent(
    name='Sales',
    instructions='Help the user with pricing and purchases.'
)

support_agent = Agent(
    name='Support',
    instructions='Help the user with technical issues.'
)

response = client.run(
    agent=triage_agent,
    messages=[{'role': 'user', 'content': 'My account is locked'}]
)

print(response.messages[-1]['content'])
# Support agent handles the locked account issue
§05


Common pitfalls

  • Swarm is educational and experimental. OpenAI does not recommend it for production workloads without significant hardening.
  • It only works with OpenAI models. There is no built-in support for Anthropic, Google, or local models.
  • Agent handoffs lose conversation history by default. Implement context passing explicitly if you need agents to share full conversation state.
  • Error handling is minimal. Production systems need retry logic, timeout handling, and fallback strategies.
  • The framework has no built-in persistence. Agent state is lost between runs unless you implement storage yourself.
  • Review the official documentation before deploying to production to ensure compatibility with your specific environment and requirements.
  • Start with default settings and customize incrementally. Changing too many configuration options at once makes debugging harder.
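One workaround for the missing persistence is to store the messages list yourself between runs. A minimal sketch using only the standard library (the file path and helper names are illustrative, not part of Swarm):

```python
import json
from pathlib import Path

HISTORY_FILE = Path("session_history.json")  # illustrative path

def load_history():
    # Return the saved messages list, or an empty history on first run.
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

def save_history(messages):
    # Persist the full messages list so the next run can resume it.
    HISTORY_FILE.write_text(json.dumps(messages, indent=2))

# Usage with Swarm (sketch):
# messages = load_history()
# messages.append({"role": "user", "content": "My account is locked"})
# response = client.run(agent=triage_agent, messages=messages)
# save_history(response.messages)
```

A JSON file is fine for prototyping; a production system would use a database keyed by session ID instead.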

Frequently Asked Questions

Is Swarm production-ready?

No. OpenAI positions Swarm as an educational framework for exploring multi-agent patterns. It lacks production features like error handling, persistence, monitoring, and scaling. Use it for prototyping and learning.

How do agent handoffs work?

A handoff is a function that returns another agent. When the current agent decides a different agent should handle the request, it calls the handoff function. Swarm then switches execution to the returned agent.

Can Swarm use tools?

Yes. Agents can have functions that act as tools. Swarm uses OpenAI's function calling to let agents decide when to invoke tools. Functions can return values, trigger handoffs, or update context.
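For example, a plain Python function can serve as a tool; the function name and the hard-coded data below are hypothetical:

```python
def check_order_status(order_id: str) -> str:
    # Hypothetical lookup; a real tool would query a database or API.
    orders = {"A-1001": "shipped", "A-1002": "processing"}
    status = orders.get(order_id, "not found")
    return f"Order {order_id} is {status}."

# Attaching the tool to an agent (sketch):
# support_agent = Agent(
#     name='Support',
#     instructions='Help the user with technical issues.',
#     functions=[check_order_status],
# )
```

Swarm reads the function signature and docstring to build the function-calling schema, so descriptive names and type hints matter.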

What are context variables?

Context variables are a shared dictionary passed between agents. They carry information like user identity, session state, or accumulated data. Any agent can read and update context variables.
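One common use is a callable `instructions` field that reads context to build the system prompt. A sketch of that pattern (the greeting logic and key name are illustrative):

```python
def instructions(context_variables):
    # Swarm calls this with the current context to build the system prompt.
    user = context_variables.get("user_name", "there")
    return f"You are a support agent. Greet the user as {user}."

# Usage (sketch):
# agent = Agent(name='Support', instructions=instructions)
# response = client.run(
#     agent=agent,
#     messages=[{'role': 'user', 'content': 'Hi'}],
#     context_variables={'user_name': 'Ada'},
# )
```

Tool functions can receive the same dictionary by declaring a `context_variables` parameter in their signature.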

How does Swarm compare to LangGraph or CrewAI?

Swarm is much simpler and lighter. LangGraph and CrewAI are full frameworks with persistence, streaming, and production features. Swarm is a minimal pattern library. Choose Swarm for learning, LangGraph or CrewAI for production.


Source & Thanks

Created by OpenAI. Licensed under MIT. openai/swarm — 21,000+ GitHub stars
