Workflows · May 7, 2026 · 3 min read

Mistral Agents API — Built-In Tools, Memory & Search

Mistral's Agents API gives any model built-in tools (web search, code interpreter, MCP), persistent memory, and multi-agent handoff via one endpoint.

Agent-ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, an install contract, JSON metadata, a per-adapter plan, and raw content so agents can evaluate compatibility, risk, and next steps.

Type: Skill · Input: Asset · Install: Stage only · Trust: New · Agent surface: any MCP/CLI agent

Universal CLI command
npx tokrepo install b3642f2b-349a-4f09-b16b-8c54dac71705
Introduction

The Mistral Agents API is a single endpoint that wraps a Mistral model with built-in tools — web search, code interpreter, image generation, document library, and MCP server connectors — plus persistent conversation memory and agent handoff. Best for: production agents where you don't want to glue tools yourself. Works with: Mistral API directly, or via the official Mistral SDK in Python / JavaScript. Setup time: 5 minutes (API key + 30 lines of code).


Create an agent

import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

agent = client.beta.agents.create(
    model="mistral-large-latest",
    name="Research Assistant",
    description="Researches topics and writes summaries",
    instructions="You are a research assistant. Use web search aggressively. Cite sources.",
    tools=[
        {"type": "web_search"},
        {"type": "code_interpreter"},
        {"type": "document_library", "library_id": "lib-xyz"},
    ],
)
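Since tools are passed as plain JSON-style dicts, they can be sanity-checked locally before the API call ever fires. A minimal sketch, assuming the tool types listed in the intro; `validate_tools` is a hypothetical helper, not part of the mistralai SDK:

```python
# Hypothetical helper (not part of the mistralai SDK): sanity-check tool
# specs locally before passing them to agents.create.
KNOWN_TOOL_TYPES = {
    "web_search", "code_interpreter", "image_generation",
    "document_library", "mcp",
}

def validate_tools(tools):
    """Return the specs unchanged; raise on an unknown or incomplete spec."""
    for spec in tools:
        if spec.get("type") not in KNOWN_TOOL_TYPES:
            raise ValueError(f"unknown tool type: {spec.get('type')!r}")
        if spec["type"] == "document_library" and "library_id" not in spec:
            raise ValueError("document_library requires a library_id")
    return tools

tools = validate_tools([
    {"type": "web_search"},
    {"type": "document_library", "library_id": "lib-xyz"},
])
print(len(tools))  # 2
```

Catching a typo in a tool type locally is cheaper than a rejected API request.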

Run a conversation

conversation = client.beta.conversations.create(
    agent_id=agent.id,
    inputs="Compare the energy density of LFP vs NMC batteries with citations.",
)

# The agent runs the web_search tool, code_interpreter, then synthesizes
print(conversation.outputs[-1].content)
print(conversation.outputs[-1].tool_calls)  # see what tools were used

Multi-agent handoff

researcher = client.beta.agents.create(
    name="Researcher", model="mistral-large-latest",
    tools=[{"type": "web_search"}],
)
writer = client.beta.agents.create(
    name="Writer", model="mistral-large-latest",
    instructions="Polish drafts into newsletter format.",
)

# Hand off mid-conversation
conv = client.beta.conversations.create(
    agent_id=researcher.id,
    inputs="Find 3 papers on retrieval-augmented generation from arxiv this month",
)
# Then transfer to writer
final = client.beta.conversations.append(
    conversation_id=conv.id,
    agent_id=writer.id,
    inputs="Now turn this into a 200-word LinkedIn post",
)
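The handoff pattern generalizes to a pipeline: each stage appends to the same conversation under a different agent id, so later agents see everything earlier agents produced. A hedged sketch; `run_pipeline` is a hypothetical wrapper around the `conversations.append` call shown above, not an SDK feature:

```python
# Hypothetical pipeline runner (not part of the mistralai SDK): run each
# (agent_id, prompt) stage against one shared conversation so each agent
# inherits the full context accumulated by earlier stages.
def run_pipeline(client, conversation_id, stages):
    """Append each (agent_id, prompt) stage; return the last response."""
    response = None
    for agent_id, prompt in stages:
        response = client.beta.conversations.append(
            conversation_id=conversation_id,
            agent_id=agent_id,
            inputs=prompt,
        )
    return response
```

With the `researcher` and `writer` agents above, the two explicit `append` steps collapse into one `run_pipeline(client, conv.id, [...])` call.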

Connect MCP servers

agent = client.beta.agents.create(
    name="Coder",
    model="codestral-latest",
    tools=[
        {
            "type": "mcp",
            "server_url": "https://your-mcp-server.example.com",
        },
    ],
)

Any MCP server (Postgres MCP, GitHub MCP, etc.) works; the Agents API speaks the protocol natively.
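Because an MCP connector is just another tool dict, wiring several servers into one agent is a one-line list comprehension. A small sketch; the helper and URLs are illustrative, not part of the SDK:

```python
# Hypothetical helper: turn MCP server URLs into tool entries for
# agents.create; each entry has the same shape as the dict above.
def mcp_tools(server_urls):
    return [{"type": "mcp", "server_url": url} for url in server_urls]

tools = mcp_tools([
    "https://your-mcp-server.example.com",
    "https://your-github-mcp.example.com",
])
print(tools[0]["type"])  # mcp
```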


FAQ

Q: Is the Mistral Agents API free? A: Pay-as-you-go via Mistral's API. Pricing varies by model (mistral-large is most expensive, mistral-small cheapest). Free tier on console.mistral.ai for prototyping. See pricing on mistral.ai/pricing.

Q: How does this compare to OpenAI Assistants API? A: Similar architecture (agent + tools + memory) but Mistral adds native MCP support — connect any MCP server as a tool with one config line. OpenAI Assistants only support OpenAI-flavored function calling.

Q: Can I use Codestral with the Agents API? A: Yes. Set model='codestral-latest' when creating the agent. Codestral is the recommended model for code-focused agents (lower latency, cheaper, better on code-specific benchmarks).


Quick Use

  1. Get an API key at console.mistral.ai
  2. pip install mistralai (or npm install @mistralai/mistralai)
  3. Use the agents.create + conversations.create snippets above



Source & Thanks

Built by Mistral AI. Commercial API.

docs.mistral.ai/agents — Official documentation

