Workflows · May 7, 2026 · 3 min read

Mistral Agents API — Built-In Tools, Memory & Search

Mistral's Agents API gives any model built-in tools (web search, code interpreter, MCP), persistent memory, and multi-agent handoff via one endpoint.

Agent-ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, an install contract, JSON metadata, a per-adapter plan, and the raw content to help agents judge fit, risk, and next actions.

Stage only · 17/100
Agent surface: Any MCP/CLI agent
Type: Skill
Installation: Stage only
Trust: New
Entry point: Asset

Universal CLI command
npx tokrepo install b3642f2b-349a-4f09-b16b-8c54dac71705
Introduction

The Mistral Agents API is a single endpoint that wraps a Mistral model with built-in tools — web search, code interpreter, image generation, document library, and MCP server connectors — plus persistent conversation memory and agent handoff. Best for: production agents where you don't want to glue tools yourself. Works with: Mistral API directly, or via the official Mistral SDK in Python / JavaScript. Setup time: 5 minutes (API key + 30 lines of code).


Create an agent

import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

agent = client.beta.agents.create(
    model="mistral-large-latest",
    name="Research Assistant",
    description="Researches topics and writes summaries",
    instructions="You are a research assistant. Use web search aggressively. Cite sources.",
    tools=[
        {"type": "web_search"},
        {"type": "code_interpreter"},
        {"type": "document_library", "library_id": "lib-xyz"},
    ],
)
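Tool entries are plain JSON objects, so a typo like "web-search" only surfaces as an error at request time. As a sketch, a hypothetical pre-flight check — the accepted `type` values below are the ones named on this page, not an exhaustive SDK list:

```python
# Hypothetical pre-flight check for tool specs. The set of known types
# is taken from this page's examples, not from the SDK itself.
KNOWN_TOOL_TYPES = {
    "web_search", "code_interpreter", "image_generation",
    "document_library", "mcp",
}

def validate_tools(tools: list[dict]) -> list[str]:
    """Return a list of human-readable problems (empty means OK)."""
    problems = []
    for i, tool in enumerate(tools):
        t = tool.get("type")
        if t not in KNOWN_TOOL_TYPES:
            problems.append(f"tools[{i}]: unknown type {t!r}")
        if t == "document_library" and "library_id" not in tool:
            problems.append(f"tools[{i}]: document_library needs a library_id")
        if t == "mcp" and "server_url" not in tool:
            problems.append(f"tools[{i}]: mcp needs a server_url")
    return problems

print(validate_tools([{"type": "web_search"}]))        # []
print(validate_tools([{"type": "document_library"}]))  # flags missing library_id
```

Running the check before `agents.create` turns a round-trip API error into an immediate local one.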

Run a conversation

conversation = client.beta.conversations.create(
    agent_id=agent.id,
    inputs="Compare the energy density of LFP vs NMC batteries with citations.",
)

# The agent runs the web_search tool, code_interpreter, then synthesizes
print(conversation.outputs[-1].content)
print(conversation.outputs[-1].tool_calls)  # see what tools were used

Multi-agent handoff

researcher = client.beta.agents.create(
    name="Researcher", model="mistral-large-latest",
    tools=[{"type": "web_search"}],
)
writer = client.beta.agents.create(
    name="Writer", model="mistral-large-latest",
    instructions="Polish drafts into newsletter format.",
)

# Hand off mid-conversation
conv = client.beta.conversations.create(
    agent_id=researcher.id,
    inputs="Find 3 papers on retrieval-augmented generation from arXiv this month",
)
# Then transfer to writer
final = client.beta.conversations.append(
    conversation_id=conv.id,
    agent_id=writer.id,
    inputs="Now turn this into a 200-word LinkedIn post",
)
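The research-then-write pattern generalizes to any ordered chain of agents. A minimal sketch of a pipeline helper; `run_pipeline` is an illustrative name, and the `client` argument is anything exposing the `beta.conversations.create` / `beta.conversations.append` calls used above, so a stub works for testing:

```python
def run_pipeline(client, agents, first_input, followup_inputs):
    """Start a conversation with agents[0], then hand off down the chain.

    followup_inputs[i] is the instruction given to agents[i + 1] after
    the handoff. Returns the final conversation object.
    """
    conv = client.beta.conversations.create(
        agent_id=agents[0].id, inputs=first_input,
    )
    for agent, inputs in zip(agents[1:], followup_inputs):
        conv = client.beta.conversations.append(
            conversation_id=conv.id, agent_id=agent.id, inputs=inputs,
        )
    return conv
```

With the `researcher` and `writer` agents from above, the two-step example collapses to `run_pipeline(client, [researcher, writer], "Find 3 papers…", ["Now turn this into a 200-word LinkedIn post"])`.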

Connect MCP servers

agent = client.beta.agents.create(
    name="Coder",
    model="codestral-latest",
    tools=[
        {
            "type": "mcp",
            "server_url": "https://your-mcp-server.example.com",
        },
    ],
)

Any MCP server (Postgres MCP, GitHub MCP, etc.) works; the Agents API speaks the protocol natively.
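Built-in tools and MCP connectors share the same `tools` list, so mixing them is just adding entries. A sketch (the server URLs are placeholders, not real endpoints):

```python
# Placeholder URLs for illustration; substitute your own MCP servers.
tools = [
    {"type": "code_interpreter"},
    {"type": "mcp", "server_url": "https://github-mcp.example.com"},
    {"type": "mcp", "server_url": "https://postgres-mcp.example.com"},
]

# Each entry declares its kind via "type"; MCP entries just add a URL.
mcp_servers = [t["server_url"] for t in tools if t["type"] == "mcp"]
print(mcp_servers)
```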


FAQ

Q: Is the Mistral Agents API free? A: Pay-as-you-go via Mistral's API. Pricing varies by model (mistral-large is most expensive, mistral-small cheapest). Free tier on console.mistral.ai for prototyping. See pricing on mistral.ai/pricing.

Q: How does this compare to OpenAI Assistants API? A: Similar architecture (agent + tools + memory) but Mistral adds native MCP support — connect any MCP server as a tool with one config line. OpenAI Assistants only support OpenAI-flavored function calling.

Q: Can I use Codestral with the Agents API? A: Yes. Set model='codestral-latest' when creating the agent. Codestral is the recommended model for code-focused agents (lower latency, cheaper, better on code-specific benchmarks).


Quick Use

  1. Get an API key at console.mistral.ai
  2. pip install mistralai (or npm install @mistralai/mistralai)
  3. Use the agents.create + conversations.create snippets above



Source & Thanks

Built by Mistral AI. Commercial API.

docs.mistral.ai/agents — Official documentation
