Cloudflare Workers MCP — Edge Functions for AI Agents
MCP server that gives AI agents access to Cloudflare Workers for deploying edge functions, managing KV storage, R2 buckets, and D1 databases. Build and deploy serverless code from chat. 1,500+ stars.
What it is
Cloudflare Workers MCP is a Model Context Protocol server that connects AI coding agents to the Cloudflare developer platform. It lets agents deploy edge functions, manage KV key-value storage, R2 object storage, and D1 SQLite databases through natural language commands.
This tool is designed for developers already using Cloudflare who want AI-assisted serverless deployment. It works with Claude Code, Cursor, and any MCP-compatible client. Setup takes under two minutes.
How it saves time or tokens
Without this MCP server, deploying a Cloudflare Worker requires switching between the Wrangler CLI, the Cloudflare dashboard, and your code editor. The MCP server collapses that workflow into a single conversation: your agent writes the function, deploys it, and configures storage in one pass. A typical session is estimated at around 2,400 tokens, covering deployment, storage configuration, and verification steps.
How to use
- Get an API token from dash.cloudflare.com/profile/api-tokens with Workers and Storage permissions.
- Add the MCP server to your .mcp.json configuration file.
- Restart your MCP client and start issuing natural language commands.
{
  "mcpServers": {
    "cloudflare": {
      "command": "npx",
      "args": ["-y", "@cloudflare/mcp-server-cloudflare"],
      "env": {
        "CLOUDFLARE_API_TOKEN": "your-token"
      }
    }
  }
}
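Before wiring the token into your configuration, it can help to confirm it is active. A minimal sketch, assuming Node 18+ with a global fetch, using Cloudflare's public token-verify endpoint (the helper names here are illustrative, not part of the MCP server):

```typescript
// Shape of the Cloudflare token-verify response we care about.
interface VerifyResponse {
  success: boolean;
  result?: { status: string };
}

// Pure check on the response body: token must be reported as "active".
function tokenIsActive(body: VerifyResponse): boolean {
  return body.success && body.result?.status === "active";
}

// Network call against Cloudflare's verify endpoint (requires a real token).
async function verifyToken(token: string): Promise<boolean> {
  const res = await fetch(
    "https://api.cloudflare.com/client/v4/user/tokens/verify",
    { headers: { Authorization: `Bearer ${token}` } },
  );
  return tokenIsActive((await res.json()) as VerifyResponse);
}
```

A token that verifies as active can still fail on specific operations if it lacks the per-service permissions described above.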
Example
Once configured, you can issue commands like these to your AI agent:
'Deploy a new worker that returns the current time in JSON'
'Store user preferences in KV namespace settings'
'List all objects in the backups R2 bucket'
'Create a D1 database called analytics and add a pageviews table'
The agent translates these into Cloudflare API calls, executes them, and reports results back in the conversation.
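For the first command above, the deployed code might look something like this. This is a hedged sketch of a module-syntax Worker that returns the current time as JSON; the exact code the agent generates will vary:

```typescript
// Minimal Worker returning the current time as JSON.
// Module-syntax Workers export an object with a fetch handler.
const worker = {
  async fetch(_request: Request): Promise<Response> {
    const body = JSON.stringify({ now: new Date().toISOString() });
    return new Response(body, {
      headers: { "content-type": "application/json" },
    });
  },
};

export default worker;
```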
Related on TokRepo
- MCP Integrations — Browse other MCP server integrations for AI agents
- AI Tools for DevOps — More developer operations tools for AI-assisted workflows
Common pitfalls
- API tokens need explicit permissions for each service (Workers, KV, R2, D1). A token missing R2 permissions will silently fail on bucket operations.
- The MCP server runs npx on every invocation, which adds cold-start latency. Pin the package version to avoid unexpected updates.
- D1 databases have a 500MB size limit on the free plan. Large datasets should use R2 or external storage instead.
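One way to apply the version-pinning advice is to specify an exact version in the npx args of your .mcp.json (the version number below is illustrative, not a recommendation):

```json
{
  "mcpServers": {
    "cloudflare": {
      "command": "npx",
      "args": ["-y", "@cloudflare/mcp-server-cloudflare@1.2.3"],
      "env": {
        "CLOUDFLARE_API_TOKEN": "your-token"
      }
    }
  }
}
```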
Frequently Asked Questions
Which Cloudflare services does the MCP server support?
The MCP server supports Cloudflare Workers (deploy and manage edge functions), KV (key-value storage), R2 (object storage similar to S3), and D1 (SQLite databases at the edge). Each service requires its own API token permissions.
Does it cost anything to use?
No. Cloudflare Workers, KV, R2, and D1 all have free tiers. The MCP server itself is open source. You only need a Cloudflare account and an API token with the right permissions.
Which AI clients are compatible?
Any MCP-compatible client works, including Claude Code, Cursor, and other tools that support the Model Context Protocol. The server communicates over the standard MCP transport layer.
How is this different from using Wrangler directly?
Wrangler requires you to write commands manually and switch between terminal and editor. The MCP server lets your AI agent handle the entire workflow in a single conversation, translating natural language into Cloudflare API calls automatically.
Can the agent modify existing Workers?
Yes. You can ask the agent to read, modify, and redeploy existing Workers. The server exposes list, get, create, update, and delete operations for Workers along with their bindings to KV, R2, and D1 resources.
Citations (3)
- Cloudflare MCP GitHub — Cloudflare Workers MCP server with 1,500+ stars
- MCP Official Docs — Model Context Protocol specification for AI tool integration
- Cloudflare Workers Docs — Cloudflare Workers serverless platform documentation
Source & Thanks
Created by Cloudflare. Licensed under MIT.
mcp-server-cloudflare — stars 1,500+