# DeerFlow — Architecture & Features
## Core Components
| Component | Purpose |
|---|---|
| Sub-Agents | Hierarchical agent orchestration — delegate and specialize |
| Sandboxes | Docker/local/K8s isolation for safe code execution |
| Memory System | Persistent context across sessions |
| Skills | Composable tools: web search, code execution, file ops |
| Context Engineering | Sophisticated prompt management and state handling |
| Multi-Channel | Telegram, Slack, Feishu/Lark, WeCom |
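The interplay of the first three components can be sketched in plain Python. This is an illustrative toy, not DeerFlow's actual API: the names `Orchestrator`, `SubAgent`, and `delegate` are hypothetical, and real sub-agents would be LangGraph nodes backed by LLM calls.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical names for illustration -- not DeerFlow's real classes.
@dataclass
class SubAgent:
    name: str
    skills: dict[str, Callable[[str], str]] = field(default_factory=dict)

    def run(self, skill: str, task: str) -> str:
        # Dispatch the task to one of this agent's composable skills.
        return self.skills[skill](task)

@dataclass
class Orchestrator:
    agents: dict[str, SubAgent] = field(default_factory=dict)
    memory: list[str] = field(default_factory=list)  # persists across steps

    def delegate(self, agent: str, skill: str, task: str) -> str:
        # Hierarchical orchestration: pick a specialist, run it, remember the result.
        result = self.agents[agent].run(skill, task)
        self.memory.append(f"{agent}.{skill}: {result}")
        return result

researcher = SubAgent("researcher", {"web_search": lambda q: f"results for {q!r}"})
coordinator = Orchestrator(agents={"researcher": researcher})
print(coordinator.delegate("researcher", "web_search", "deerflow"))  # → results for 'deerflow'
```

The point of the shape: the orchestrator owns the memory, so every delegated step leaves a trace that later steps (or later sessions) can read back as context.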
## Deployment Modes
| Mode | Description |
|---|---|
| Standard | Separate LangGraph server + frontend |
| Gateway | Embedded agent runtime — fewer processes, lower overhead |
| Docker | `make docker-start` — recommended for production |
| Kubernetes | K8s-ready with sandbox isolation |
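The core idea behind sandboxed execution in the local mode can be approximated with a separate process and a hard timeout. This is a simplified stand-in, not DeerFlow's sandbox implementation; the Docker/K8s modes add filesystem and network isolation on top of the process separation shown here.

```python
import subprocess
import sys

def run_sandboxed(code: str, timeout_s: float = 5.0) -> str:
    """Run untrusted Python in a child process with a hard timeout.

    Minimal local-isolation sketch (hypothetical helper); real sandboxes
    (Docker/K8s) also confine the filesystem and network.
    """
    proc = subprocess.run(
        [sys.executable, "-I", "-c", code],  # -I: isolated mode, ignores env and site
        capture_output=True,
        text=True,
        timeout=timeout_s,  # raises subprocess.TimeoutExpired on runaway code
    )
    if proc.returncode != 0:
        raise RuntimeError(proc.stderr.strip())
    return proc.stdout

print(run_sandboxed("print(2 + 2)"))  # → 4
```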
## Tech Stack
- Backend: Python 3.12+, LangChain, LangGraph, FastAPI
- Frontend: Node.js 22+, React/TypeScript
- Observability: LangSmith/Langfuse tracing
- Package Management: `uv` (Python), `pnpm` (Node.js)
## Supported LLM Providers
OpenAI, Anthropic Claude, OpenRouter, DeepSeek, Doubao, Kimi — configure providers in `config.yaml`.
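A rough sketch of what a provider entry in `config.yaml` might look like. The key names below (`llm`, `provider`, `model`, `api_key`, `base_url`) are assumptions for illustration, not DeerFlow's documented schema; consult the project's sample config for the real layout.

```yaml
# Hypothetical provider entry -- key names are illustrative only.
llm:
  provider: openai
  model: gpt-4o
  api_key: ${OPENAI_API_KEY}   # you supply your own key
  base_url: https://api.openai.com/v1
```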
## FAQ
**Q: What is DeerFlow?**
A: ByteDance's open-source SuperAgent framework that orchestrates sub-agents, sandboxes, and memory for complex tasks spanning minutes to hours. 58,300+ stars, MIT license.

**Q: Is it free?**
A: Yes, MIT license. You provide your own LLM API keys.

**Q: How does it compare to CrewAI or AutoGen?**
A: DeerFlow focuses on long-horizon tasks with sandbox execution, persistent memory, and multi-channel integration; CrewAI and AutoGen focus more on multi-agent conversation patterns.