Quick Use
- Get an API key at console.mistral.ai
- pip install mistralai (or npm install @mistralai/mistralai)
- Use the agents.create + conversations.create snippet below
Intro
The Mistral Agents API is a single endpoint that wraps a Mistral model with built-in tools — web search, code interpreter, image generation, document library, and MCP server connectors — plus persistent conversation memory and agent handoff. Best for: production agents where you don't want to glue tools together yourself. Works with: Mistral API directly, or via the official Mistral SDK in Python / JavaScript. Setup time: 5 minutes (API key + 30 lines of code).
Create an agent
```python
import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

agent = client.beta.agents.create(
    model="mistral-large-latest",
    name="Research Assistant",
    description="Researches topics and writes summaries",
    instructions="You are a research assistant. Use web search aggressively. Cite sources.",
    tools=[
        {"type": "web_search"},
        {"type": "code_interpreter"},
        {"type": "document_library", "library_id": "lib-xyz"},
    ],
)
```
Run a conversation
```python
conversation = client.beta.conversations.create(
    agent_id=agent.id,
    inputs="Compare the energy density of LFP vs NMC batteries with citations.",
)

# The agent runs web_search, then code_interpreter, then synthesizes an answer
print(conversation.outputs[-1].content)
print(conversation.outputs[-1].tool_calls)  # see which tools were used
```
Multi-agent handoff
```python
researcher = client.beta.agents.create(
    name="Researcher", model="mistral-large-latest",
    tools=[{"type": "web_search"}],
)
writer = client.beta.agents.create(
    name="Writer", model="mistral-large-latest",
    instructions="Polish drafts into newsletter format.",
)

# Start with the researcher
conv = client.beta.conversations.create(
    agent_id=researcher.id,
    inputs="Find 3 papers on retrieval-augmented generation from arxiv this month",
)

# Then hand the conversation off to the writer mid-stream
final = client.beta.conversations.append(
    conversation_id=conv.id,
    agent_id=writer.id,
    inputs="Now turn this into a 200-word LinkedIn post",
)
```
Connect MCP servers
```python
agent = client.beta.agents.create(
    name="Coder",
    model="codestral-latest",
    tools=[
        {
            "type": "mcp",
            "server_url": "https://your-mcp-server.example.com",
        },
    ],
)
```
Any MCP server (Postgres MCP, GitHub MCP, etc.) works; the Agents API speaks the protocol natively.
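Because every MCP connector uses the same entry shape, built-in tools and MCP servers can be mixed freely in one tools list. A minimal sketch (both server URLs below are placeholders, not real endpoints):

```python
# Tools list mixing built-in tools with MCP connectors.
# The two server URLs are illustrative placeholders.
tools = [
    {"type": "web_search"},
    {"type": "code_interpreter"},
    {"type": "mcp", "server_url": "https://postgres-mcp.example.com"},
    {"type": "mcp", "server_url": "https://github-mcp.example.com"},
]

# Every MCP entry has the same shape; only server_url changes.
mcp_urls = [t["server_url"] for t in tools if t["type"] == "mcp"]
print(mcp_urls)
```

Passing a list like this to agents.create gives the agent all four tools at once; the model decides per turn which one to call.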
FAQ
Q: Is the Mistral Agents API free? A: No; it is pay-as-you-go via Mistral's API. Pricing varies by model (mistral-large is the most expensive, mistral-small the cheapest). A free tier on console.mistral.ai is available for prototyping. See pricing on mistral.ai/pricing.
Q: How does this compare to OpenAI Assistants API? A: Similar architecture (agent + tools + memory) but Mistral adds native MCP support — connect any MCP server as a tool with one config line. OpenAI Assistants only support OpenAI-flavored function calling.
Q: Can I use Codestral with the Agents API?
A: Yes. Set model='codestral-latest' when creating the agent. Codestral is the recommended model for code-focused agents (lower latency, cheaper, better on code-specific benchmarks).
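As a sketch, the config for such a code-focused agent could look like this (the field names mirror the agents.create calls above; the agent name and instructions are illustrative, not from Mistral's docs):

```python
# Illustrative config for a code-focused agent. "Code Reviewer" and the
# instructions are made-up examples; the field names follow agents.create.
code_agent_config = {
    "model": "codestral-latest",
    "name": "Code Reviewer",
    "instructions": "Review code and run snippets to verify behavior.",
    "tools": [{"type": "code_interpreter"}],
}

# With a client in scope, this would become:
# agent = client.beta.agents.create(**code_agent_config)
```

Pairing Codestral with code_interpreter lets the agent execute the snippets it writes and check its own output.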
Source & Thanks
Built by Mistral AI. Commercial API.
docs.mistral.ai/agents — Official documentation