Cloudflare
Verified · @cloudflare
Edge-native AI infrastructure — Workers AI, AI Gateway, Wrangler MCP, Pingora. Run AI inference at every PoP, with caching and analytics built in.
MCP Configs
Wrangler MCP — Cloudflare Workers for AI Agents
MCP server for managing Cloudflare Workers, KV, R2, and D1 from AI agents. Deploy serverless functions, manage storage, and query databases through Claude Code tool calls.
Cloudflare Workers MCP — Edge Functions for AI Agents
MCP server that gives AI agents access to Cloudflare Workers for deploying edge functions, managing KV storage, R2 buckets, and D1 databases. Build and deploy serverless code from chat. 1,500+ stars.
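Servers like the two above are wired into a client through its MCP configuration file. A minimal sketch of such an entry follows; the package name `@cloudflare/mcp-server-cloudflare` and its arguments are assumptions here — check the server's own README for the exact command.

```json
{
  "mcpServers": {
    "cloudflare": {
      "command": "npx",
      "args": ["-y", "@cloudflare/mcp-server-cloudflare"]
    }
  }
}
```

Once registered, the client launches the server on demand and exposes its Workers, KV, R2, and D1 operations as tool calls.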
Scripts
Pingora — Fast Programmable HTTP Proxy Framework by Cloudflare
Pingora is a Rust framework for building fast, reliable, and programmable network services. Open-sourced by Cloudflare, it powers a significant portion of their HTTP traffic, handling over a trillion requests daily across the global network.
Cloudflare Workers AI — Serverless AI Inference
Run AI models at the edge with Cloudflare Workers. Text generation, image generation, speech-to-text, translation, embeddings — all serverless with global distribution.
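Besides the in-Worker binding, Workers AI is reachable over Cloudflare's REST API from any runtime. A sketch, assuming the documented `/accounts/{account_id}/ai/run/{model}` route; the account ID, token, and model name are placeholders, not values from this listing.

```typescript
// Sketch: calling Workers AI via the Cloudflare REST API.
// Account ID, API token, and model name below are placeholders.

const API_BASE = "https://api.cloudflare.com/client/v4";

// Build the inference URL for a given account and model.
export function aiRunUrl(accountId: string, model: string): string {
  return `${API_BASE}/accounts/${accountId}/ai/run/${model}`;
}

// Example invocation (requires a real account ID and API token):
export async function generate(accountId: string, token: string, prompt: string) {
  const res = await fetch(aiRunUrl(accountId, "@cf/meta/llama-3.1-8b-instruct"), {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ messages: [{ role: "user", content: prompt }] }),
  });
  return res.json();
}
```

The same URL shape covers text generation, embeddings, and speech-to-text; only the model slug changes.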
Configs
cloudflared — Cloudflare Tunnel Client for Exposing Services Without Opening Ports
cloudflared is the client daemon for Cloudflare Tunnel. Expose a local web app, SSH, or any TCP service to the internet through Cloudflare's edge — no public IP, no open ports, zero-trust access policies.
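A named tunnel is driven by a small YAML config. A sketch, with the tunnel ID, credentials path, and hostname as placeholders:

```yaml
# ~/.cloudflared/config.yml — named-tunnel sketch; IDs and paths are placeholders.
tunnel: <your-tunnel-id>
credentials-file: /home/user/.cloudflared/<your-tunnel-id>.json
# Ingress rules match top to bottom; the final rule must be a catch-all.
ingress:
  - hostname: app.example.com
    service: http://localhost:8000
  - service: http_status:404
```

For a throwaway tunnel with no config at all, `cloudflared tunnel --url http://localhost:8000` prints a temporary public URL.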
Cloudflare AI Workers — Deploy AI Apps at the Edge
Run AI models on Cloudflare's global edge network. Workers AI provides serverless inference for LLMs, embeddings, image generation, and speech-to-text at low latency.
Cloudflare AI Gateway — LLM Proxy, Cache & Analytics
Free proxy gateway for LLM API calls with caching, rate limiting, cost tracking, and fallback routing across providers. Response caching can cut costs by up to 95%. 7,000+ stars.
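In practice the gateway sits in front of a provider by swapping the base URL of an existing client. A sketch, assuming the documented `gateway.ai.cloudflare.com/v1/{account_id}/{gateway}/{provider}` URL pattern; the account and gateway IDs are placeholders.

```typescript
// Sketch: routing an OpenAI-compatible client through AI Gateway so calls
// pick up caching, rate limiting, and analytics. IDs are placeholders.

const GATEWAY_BASE = "https://gateway.ai.cloudflare.com/v1";

// Build the per-provider base URL for a configured gateway.
export function gatewayUrl(accountId: string, gateway: string, provider: string): string {
  return `${GATEWAY_BASE}/${accountId}/${gateway}/${provider}`;
}

// e.g. pass as the baseURL of the official openai SDK client
// (that SDK does accept a custom baseURL):
//   new OpenAI({ baseURL: gatewayUrl("ACCOUNT_ID", "my-gateway", "openai") })
```

Because only the base URL changes, the fallback routing and caching apply without touching the rest of the calling code.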