Scripts · Apr 3, 2026 · 2 min read

Hono — Ultrafast Web Framework for the Edge

Tiny, fast web framework for Cloudflare Workers, Deno, Bun, and Node.js. Perfect API layer for AI services. 29K+ GitHub stars.

Script Depot · Community
Quick Use

Use it first, then decide how deep to go

Copy the commands below to scaffold a project, then paste in the handler code to get a working AI endpoint before reading further.

npm create hono@latest my-api
cd my-api && npm install && npm run dev
import { Hono } from 'hono';

// Type the Worker bindings so c.env.OPENAI_API_KEY is type-safe
type Bindings = { OPENAI_API_KEY: string };

const app = new Hono<{ Bindings: Bindings }>();

app.get('/', (c) => c.text('Hello Hono!'));

app.post('/api/chat', async (c) => {
    const { prompt } = await c.req.json();
    const response = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
            'Authorization': `Bearer ${c.env.OPENAI_API_KEY}`,
            'Content-Type': 'application/json',
        },
        body: JSON.stringify({
            model: 'gpt-4o',
            messages: [{ role: 'user', content: prompt }],
        }),
    });
    return c.json(await response.json());
});

export default app;

Deploy to Cloudflare Workers:

npx wrangler deploy
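The handler above reads `c.env.OPENAI_API_KEY`. On Cloudflare Workers that binding comes from your Wrangler configuration; a minimal sketch (name, entry point, and date are illustrative):

```toml
# wrangler.toml — minimal sketch, values are illustrative
name = "my-api"
main = "src/index.ts"
compatibility_date = "2024-01-01"
```

Store the API key as a secret rather than a plain-text var, so it never lands in version control: `npx wrangler secret put OPENAI_API_KEY`.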

Intro

Hono is an ultrafast, lightweight web framework (29,700+ GitHub stars) built on Web Standards. It runs everywhere — Cloudflare Workers, Deno, Bun, Node.js, AWS Lambda, Vercel — making it an ideal API layer for AI services that need low latency and global distribution. With zero dependencies, built-in middleware (CORS, bearer auth, logging, and more), OpenAPI schema generation via @hono/zod-openapi, and RPC-style type-safe clients, Hono lets you build production AI APIs in minutes. At ~14KB, it starts in under 1ms on edge runtimes.

Works with: Cloudflare Workers, Deno, Bun, Node.js, AWS Lambda, Vercel Edge, any OpenAI/Anthropic API. Best for developers building low-latency AI API endpoints at the edge. Setup time: under 2 minutes.


Hono Features

Multi-Runtime Support

Runtime              Command
Cloudflare Workers   npx wrangler deploy
Deno                 deno serve app.ts
Bun                  bun run app.ts
Node.js              node app.js (with the @hono/node-server adapter)
AWS Lambda           deploy via SAM/CDK
Vercel Edge          deploy via Vercel CLI

Built-in Middleware

import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { bearerAuth } from 'hono/bearer-auth';
// Note: rate limiting is not part of Hono core — use the third-party
// hono-rate-limiter package if you need it.

const app = new Hono();

app.use('/api/*', cors());
// Load the token from an env binding in production instead of hardcoding it
app.use('/api/*', bearerAuth({ token: 'my-secret-token' }));

AI Streaming Responses

import { Hono } from 'hono';
import { streamSSE } from 'hono/streaming';

const app = new Hono<{ Bindings: { OPENAI_API_KEY: string } }>();

app.post('/api/stream', async (c) => {
    const { messages } = await c.req.json();
    return streamSSE(c, async (stream) => {
        const response = await fetch('https://api.openai.com/v1/chat/completions', {
            method: 'POST',
            headers: {
                'Authorization': `Bearer ${c.env.OPENAI_API_KEY}`,
                'Content-Type': 'application/json',
            },
            body: JSON.stringify({ model: 'gpt-4o', messages, stream: true }),
        });

        // Forward OpenAI's SSE chunks to the client as they arrive
        const reader = response.body!.getReader();
        const decoder = new TextDecoder();
        while (true) {
            const { done, value } = await reader.read();
            if (done) break;
            await stream.writeSSE({ data: decoder.decode(value) });
        }
    });
});
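On the consuming side, the /api/stream endpoint's SSE body can be read incrementally with the Web Streams API. A small sketch of a helper (the function name is illustrative, not part of Hono) that buffers partial chunks and fires a callback per data: line:

```typescript
// Consume an SSE body stream, invoking onData for each complete "data:" line.
// Works in any runtime with Web Streams (browsers, Workers, Deno, Bun, Node 18+).
async function consumeSSE(
    body: ReadableStream<Uint8Array>,
    onData: (data: string) => void,
): Promise<void> {
    const reader = body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';
    while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        buffer += decoder.decode(value, { stream: true });
        // Only process complete lines; keep the trailing partial line in the buffer
        const lines = buffer.split('\n');
        buffer = lines.pop() ?? '';
        for (const line of lines) {
            if (line.startsWith('data: ')) onData(line.slice(6));
        }
    }
}

// Usage against the endpoint above:
// const res = await fetch('/api/stream', { method: 'POST', body: JSON.stringify({ messages }) });
// await consumeSSE(res.body!, (chunk) => console.log(chunk));
```

The buffering step matters: network chunk boundaries rarely align with SSE event boundaries, so splitting on raw chunks would drop or duplicate tokens.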

OpenAPI Schema Generation

import { OpenAPIHono, createRoute, z } from '@hono/zod-openapi';

// Routes must be registered on an OpenAPIHono instance, not a plain Hono app
const app = new OpenAPIHono();

const route = createRoute({
    method: 'post',
    path: '/api/chat',
    request: { body: { content: { 'application/json': { schema: z.object({ prompt: z.string() }) } } } },
    responses: { 200: { description: 'Chat response' } },
});

app.openapi(route, (c) => { /* handler */ });
app.doc('/docs', { openapi: '3.0.0', info: { title: 'AI API', version: '1.0' } });

Performance

  • ~14KB bundle size (zero dependencies)
  • <1ms cold start on edge runtimes
  • 150K+ req/s on Bun benchmarks
  • Built on Web Standards (Request/Response)

FAQ

Q: What is Hono? A: Hono is an ultrafast web framework with 29,700+ GitHub stars that runs on Cloudflare Workers, Deno, Bun, Node.js, and more. ~14KB, <1ms cold start, built-in middleware, ideal for AI API endpoints.

Q: Why use Hono for AI services? A: AI APIs need low latency and global distribution. Hono runs at the edge (Cloudflare Workers) with <1ms cold start, has built-in SSE streaming for LLM responses, and includes auth/CORS/rate-limiting middleware out of the box.

Q: Is Hono free? A: Yes, open-source under MIT license.



Source & Thanks

Created by Yusuke Wada. Licensed under MIT.

hono — ⭐ 29,700+

