Scripts · Apr 2, 2026 · 2 min read

Qwen-Agent — Build AI Agents on Qwen Models

Agent framework by Alibaba with function calling, code interpreter, RAG, and MCP support. Built for Qwen 3.0+. 15K+ stars.

Introduction

Qwen-Agent is an open-source agent framework by the Alibaba Qwen team, designed to build LLM-powered applications on top of Qwen models (3.0+). With 15,800+ GitHub stars, it provides native function calling, a sandboxed code interpreter running in Docker, RAG pipelines for document understanding, MCP tool discovery, and a built-in Gradio GUI for rapid prototyping. It's the official way to build agents with Qwen — the leading open-weight LLM family from China.

Works with: Qwen models (qwen-max, qwen-plus, qwen-turbo), DashScope API, local Qwen deployments. Best for developers building AI agents on the Qwen ecosystem. Setup time: under 5 minutes.


Qwen-Agent Architecture & Features

Core Components

┌─────────────────────────────────────┐
│           Agent Layer               │
│   Assistant · ReActChat · Router    │
├─────────────────────────────────────┤
│           Tool Layer                │
│ Function Calling · Code Interpreter │
│   Web Search · MCP Tools · Custom   │
├─────────────────────────────────────┤
│           LLM Layer                 │
│  Qwen-Max · Qwen-Plus · Local Qwen  │
└─────────────────────────────────────┘

Function Calling

Native function calling with automatic schema parsing:

import json5

from qwen_agent.tools.base import BaseTool, register_tool

@register_tool("get_weather")
class WeatherTool(BaseTool):
    description = "Get weather for a city"
    parameters = [{"name": "city", "type": "string",
                   "description": "Name of the city", "required": True}]

    def call(self, params: str, **kwargs) -> str:
        # params arrives as a JSON string emitted by the model
        city = json5.loads(params)["city"]
        return f"Weather in {city}: 25°C, sunny"
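Under the hood, `@register_tool` adds the class to a registry that the framework consults whenever the model emits a tool call. A minimal, self-contained sketch of that lookup-and-dispatch step (illustrative only, not Qwen-Agent's actual internals):

```python
# Minimal sketch of a decorator-based tool registry dispatching a
# model-emitted JSON argument string to a tool's call() method.
# Illustrative only -- not Qwen-Agent's real implementation.
import json

TOOL_REGISTRY = {}

def register_tool(name):
    def decorator(cls):
        TOOL_REGISTRY[name] = cls()  # store one instance per tool name
        return cls
    return decorator

@register_tool("get_weather")
class WeatherTool:
    def call(self, params, **kwargs):
        city = json.loads(params)["city"]
        return f"Weather in {city}: 25°C, sunny"

def dispatch(tool_name, params_json):
    # Look up the registered tool and forward the raw JSON string
    return TOOL_REGISTRY[tool_name].call(params_json)

print(dispatch("get_weather", '{"city": "Hangzhou"}'))
# Weather in Hangzhou: 25°C, sunny
```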

Code Interpreter

Executes Python code in an isolated Docker sandbox:

  • Matplotlib/Seaborn chart generation
  • Pandas data analysis
  • File I/O in sandboxed environment
  • Automatic output capture and display
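The core mechanic behind the bullet points above is "run the model's code, capture what it prints". A stripped-down sketch using a subprocess in place of the Docker sandbox (for illustration only; Qwen-Agent's real interpreter adds container isolation, session state, and file handling):

```python
# Sketch of the execute-and-capture step a code interpreter performs.
# A subprocess stands in for the Docker sandbox here.
import subprocess
import sys

def run_snippet(code: str, timeout: int = 10) -> str:
    """Run a Python snippet in a child process and capture its output."""
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    # Return stdout on success, the error text otherwise
    return result.stdout if result.returncode == 0 else result.stderr

print(run_snippet("import math; print(math.sqrt(16))"))
# 4.0
```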

RAG (Retrieval Augmented Generation)

Built-in document understanding pipeline:

  • PDF, DOCX, HTML, Markdown parsing
  • Chunking with configurable overlap
  • Vector similarity search
  • Citation tracking in responses
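The "chunking with configurable overlap" step can be sketched in a few lines; `size` and `overlap` here are illustrative parameter names, not Qwen-Agent's actual configuration keys:

```python
# Sketch of fixed-size chunking with configurable overlap -- the
# splitting strategy a RAG pipeline applies before embedding chunks.
def chunk(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    step = size - overlap  # each chunk starts `step` chars after the last
    return [text[i:i + size] for i in range(0, len(text), step)]

pieces = chunk("abcdefghij", size=4, overlap=2)
# ['abcd', 'cdef', 'efgh', 'ghij', 'ij']
```

Overlap trades index size for recall: a query term that straddles a chunk boundary still appears intact in the neighbouring chunk.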

MCP Support

Connect external tools via Model Context Protocol:

from qwen_agent.agents import Assistant

# MCP servers are declared in the tool list as an "mcpServers" config
agent = Assistant(
    llm={"model": "qwen-max"},
    function_list=[{
        "mcpServers": {
            "my-tools": {"url": "http://localhost:8080/mcp"}
        }
    }]
)

Built-in GUI

Launch a Gradio chat interface instantly:

from qwen_agent.gui import WebUI

# `agent` is any previously constructed agent (e.g. an Assistant)
WebUI(agent).run(server_port=7860)

Multi-Agent Orchestration

Route tasks to specialized agents:

from qwen_agent.agents import GroupChat, Router

# coder_agent, researcher_agent, writer_agent are pre-built agents;
# Router picks one per request, GroupChat lets them collaborate freely
agents = [coder_agent, researcher_agent, writer_agent]
router = Router(llm=llm, agents=agents)
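Conceptually, a router asks the LLM which registered agent best fits the incoming task, then forwards the request to that agent. A toy sketch with a keyword heuristic standing in for the LLM call (all names here are hypothetical, not Qwen-Agent's API):

```python
# Toy sketch of task routing: pick an agent by name, forward the task.
# choose_agent() stands in for what is really an LLM classification call.
def choose_agent(task: str, agent_names: list[str]) -> str:
    # A real Router prompts the LLM with the agent descriptions;
    # a keyword heuristic keeps this sketch self-contained.
    if "code" in task.lower():
        return "coder"
    if "research" in task.lower():
        return "researcher"
    return "writer"

def route(task: str, agents: dict) -> str:
    name = choose_agent(task, list(agents))
    return agents[name](task)

# Each "agent" is just a callable in this sketch
agents = {
    "coder": lambda t: f"[coder] {t}",
    "researcher": lambda t: f"[researcher] {t}",
    "writer": lambda t: f"[writer] {t}",
}
print(route("Write code to parse CSV", agents))
# [coder] Write code to parse CSV
```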

FAQ

Q: What is Qwen-Agent? A: Qwen-Agent is an open-source Python framework by Alibaba for building AI agents on Qwen models, featuring function calling, code interpretation, RAG, and MCP support. 15,800+ GitHub stars.

Q: Can I use Qwen-Agent with non-Qwen models? A: It's optimized for Qwen models but supports any OpenAI-compatible API endpoint. Best results come from Qwen-Max and Qwen-Plus, which have native function calling.

Q: Is Qwen-Agent free? A: Yes, Apache-2.0 licensed. The Qwen models themselves have varying licenses — Qwen 3.0 is Apache-2.0 for most sizes.



Source and acknowledgements

Created by QwenLM (Alibaba Qwen Team). Licensed under Apache-2.0.

Qwen-Agent — ⭐ 15,800+

Thanks to the Qwen team at Alibaba for open-sourcing both the models and the agent framework.
