# Hono — Ultrafast Web Framework for the Edge

> Tiny, fast web framework for Cloudflare Workers, Deno, Bun, and Node.js. Perfect API layer for AI services. 29K+ GitHub stars.

## Quick Use

```bash
npm create hono@latest my-api
cd my-api && npm install && npm run dev
```

```typescript
import { Hono } from 'hono';

const app = new Hono();

app.get('/', (c) => c.text('Hello Hono!'));

app.post('/api/chat', async (c) => {
  const { prompt } = await c.req.json();
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${c.env.OPENAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gpt-4o',
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  return c.json(await response.json());
});

export default app;
```

Deploy to Cloudflare Workers:

```bash
npx wrangler deploy
```

---

## Intro

Hono is an ultrafast, lightweight web framework with 29,700+ GitHub stars, built on Web Standards. It runs everywhere — Cloudflare Workers, Deno, Bun, Node.js, AWS Lambda, Vercel — making it an ideal API layer for AI services that need low latency and global distribution. With zero dependencies, built-in middleware (CORS, auth, and more), OpenAPI schema generation, and RPC-style type-safe clients, Hono lets you build production AI APIs in minutes. At ~14KB, it starts in under 1ms on edge runtimes.

Works with: Cloudflare Workers, Deno, Bun, Node.js, AWS Lambda, Vercel Edge, any OpenAI/Anthropic API.

Best for developers building low-latency AI API endpoints at the edge. Setup time: under 2 minutes.
---

## Hono Features

### Multi-Runtime Support

| Runtime | Command |
|---------|---------|
| **Cloudflare Workers** | `npx wrangler deploy` |
| **Deno** | `deno serve app.ts` |
| **Bun** | `bun run app.ts` |
| **Node.js** | `node app.js` |
| **AWS Lambda** | Deploy via SAM/CDK |
| **Vercel Edge** | Deploy via Vercel CLI |

### Built-in Middleware

```typescript
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { bearerAuth } from 'hono/bearer-auth';
// Rate limiting is not part of Hono core; it is available via the
// community `hono-rate-limiter` package.

const app = new Hono();

app.use('/api/*', cors());
app.use('/api/*', bearerAuth({ token: 'my-secret-token' }));
```

### AI Streaming Responses

```typescript
import { Hono } from 'hono';
import { streamSSE } from 'hono/streaming';

const app = new Hono();

app.post('/api/stream', async (c) => {
  const { prompt } = await c.req.json();
  return streamSSE(c, async (stream) => {
    const response = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${c.env.OPENAI_API_KEY}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: 'gpt-4o',
        messages: [{ role: 'user', content: prompt }],
        stream: true,
      }),
    });
    // Relay the upstream stream to the client as server-sent events.
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      await stream.writeSSE({ data: decoder.decode(value) });
    }
  });
});
```

### OpenAPI Schema Generation

```typescript
import { OpenAPIHono, createRoute, z } from '@hono/zod-openapi';

const app = new OpenAPIHono();

const route = createRoute({
  method: 'post',
  path: '/api/chat',
  request: {
    body: {
      content: {
        'application/json': { schema: z.object({ prompt: z.string() }) },
      },
    },
  },
  responses: { 200: { description: 'Chat response' } },
});

app.openapi(route, (c) => { /* handler */ });
app.doc('/docs', { openapi: '3.0.0', info: { title: 'AI API', version: '1.0' } });
```

### Performance

- **~14KB** bundle size (zero dependencies)
- **<1ms** cold start on edge runtimes
- **150K+ req/s** on Bun benchmarks
- Built on Web Standards (Request/Response)

---

## FAQ

**Q: What is Hono?**
A: Hono is an ultrafast web framework with 29,700+ GitHub
stars that runs on Cloudflare Workers, Deno, Bun, Node.js, and more. ~14KB, <1ms cold start, built-in middleware, ideal for AI API endpoints.

**Q: Why use Hono for AI services?**
A: AI APIs need low latency and global distribution. Hono runs at the edge (Cloudflare Workers) with <1ms cold start, has built-in SSE streaming for LLM responses, and includes auth and CORS middleware out of the box, with rate limiting available as middleware.

**Q: Is Hono free?**
A: Yes, open-source under the MIT license.

---

## Source & Thanks

> Created by [Yusuke Wada](https://github.com/honojs). Licensed under MIT.
>
> [hono](https://github.com/honojs/hono) — ⭐ 29,700+

---

Source: https://tokrepo.com/en/workflows/ff72c952-86a4-402d-b288-7468b3a3f4c5
Author: Script Depot