# Hono Features

## Multi-Runtime Support
| Runtime | Command |
|---|---|
| Cloudflare Workers | npx wrangler deploy |
| Deno | deno serve app.ts |
| Bun | bun run app.ts |
| Node.js | node app.js |
| AWS Lambda | Deploy via SAM/CDK |
| Vercel Edge | Deploy via Vercel CLI |
## Built-in Middleware
```ts
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { bearerAuth } from 'hono/bearer-auth';

const app = new Hono();

app.use('/api/*', cors());
app.use('/api/*', bearerAuth({ token: 'my-secret-token' }));
```

Rate limiting is not part of Hono core; it is available through the community `hono-rate-limiter` package.

## AI Streaming Responses
```ts
import { Hono } from 'hono';
import { streamSSE } from 'hono/streaming';

type Bindings = { OPENAI_API_KEY: string };
const app = new Hono<{ Bindings: Bindings }>();

app.post('/api/stream', async (c) => {
  return streamSSE(c, async (stream) => {
    const response = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${c.env.OPENAI_API_KEY}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ model: 'gpt-4o', messages: [/* ... */], stream: true }),
    });

    // Relay the upstream SSE stream to the client chunk by chunk.
    const reader = response.body!.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      await stream.writeSSE({ data: decoder.decode(value) });
    }
  });
});
```

## OpenAPI Schema Generation
```ts
import { OpenAPIHono, createRoute, z } from '@hono/zod-openapi';

const app = new OpenAPIHono();

const route = createRoute({
  method: 'post',
  path: '/api/chat',
  request: {
    body: { content: { 'application/json': { schema: z.object({ prompt: z.string() }) } } },
  },
  responses: { 200: { description: 'Chat response' } },
});

app.openapi(route, (c) => { /* handler */ });
app.doc('/docs', { openapi: '3.0.0', info: { title: 'AI API', version: '1.0' } });
```

## Performance
- ~14KB bundle size (zero dependencies)
- <1ms cold start on edge runtimes
- 150K+ req/s on Bun benchmarks
- Built on Web Standards (Request/Response)
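Because handlers consume and produce standard `Request`/`Response` objects, they can be exercised without a server or runtime adapter. A sketch of that idea using only Web-standard APIs (no framework import, so the handler shape shown here is illustrative):

```typescript
// A fetch-style handler: the same (req: Request) => Response shape
// that Web-standards frameworks like Hono build on.
const handler = (req: Request): Response =>
  new Response(JSON.stringify({ path: new URL(req.url).pathname }), {
    headers: { 'Content-Type': 'application/json' },
  });

// No server needed: construct a standard Request and invoke directly.
const res = handler(new Request('https://example.com/api/chat'));
```

This is why the same code ports across Cloudflare Workers, Deno, Bun, and Node.js 18+: all of them ship the same `Request`/`Response` globals.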
## FAQ
**Q: What is Hono?**
A: Hono is an ultrafast web framework with 29,700+ GitHub stars that runs on Cloudflare Workers, Deno, Bun, Node.js, and more. It is ~14KB, has a <1ms cold start, ships built-in middleware, and is well suited to AI API endpoints.

**Q: Why use Hono for AI services?**
A: AI APIs need low latency and global distribution. Hono runs at the edge (e.g. Cloudflare Workers) with a <1ms cold start, has built-in SSE streaming for LLM responses, and includes auth and CORS middleware out of the box (rate limiting is available via the community `hono-rate-limiter` package).

**Q: Is Hono free?**
A: Yes, Hono is open source under the MIT license.