Scripts · Apr 2, 2026 · 3 min read

bolt.diy — AI Full-Stack App Builder, Any LLM

Community fork of Bolt.new. Prompt, edit, and deploy full-stack web apps with any LLM provider. 19K+ GitHub stars.

Script Depot · Community
Quick Use

Use it first, then decide how deep to go

The commands below are everything you need to copy, install, and run first.

git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy
npm install -g pnpm
pnpm install

Configure your LLM provider in .env.local:

OPENAI_API_KEY=sk-...
# Or: ANTHROPIC_API_KEY=sk-ant-...
# Or: GOOGLE_GENERATIVE_AI_API_KEY=...
# Or: OLLAMA_API_BASE_URL=http://localhost:11434

Then start the dev server:

pnpm run dev

Open http://localhost:5173 — start building apps with natural language.


Intro

bolt.diy is a community-driven open-source fork of Bolt.new (19,200+ GitHub stars) that lets you build, edit, and deploy full-stack web applications entirely through natural-language prompts. Unlike the original Bolt.new, which locks you into specific providers, bolt.diy works with any LLM: OpenAI, Anthropic Claude, Google Gemini, Groq, Mistral, local models via Ollama, and many more. It provides an in-browser code editor, live preview, terminal, and one-click deployment, so you can build complete React, Next.js, or Vue apps from a text description in minutes.

Works with: OpenAI, Anthropic Claude, Google Gemini, Groq, Mistral, Ollama (local), OpenRouter, any OpenAI-compatible API. Best for developers who want an AI app builder with full LLM provider freedom. Setup time: under 5 minutes.


bolt.diy Features

Multi-Provider LLM Support

Provider     Models
OpenAI       GPT-4o, GPT-4o-mini
Anthropic    Claude Sonnet, Claude Haiku
Google       Gemini Pro, Gemini Flash
Groq         Llama 3, Mixtral (fast)
Mistral      Mistral Large, Codestral
Ollama       Any local model
OpenRouter   100+ models
Together     Open-source models
DeepSeek     DeepSeek Coder
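Most of these providers expose an OpenAI-compatible chat-completions API, which is what makes switching them a configuration change rather than a code change. A minimal TypeScript sketch of that shared request shape — the base URLs and model names here are illustrative assumptions, not bolt.diy internals:

```typescript
// Sketch: build a chat-completions request for any OpenAI-compatible provider.
// baseUrl and model values are example assumptions, not bolt.diy's actual config.
interface ProviderConfig {
  baseUrl: string;  // e.g. https://api.openai.com/v1 or http://localhost:11434/v1
  apiKey?: string;  // omitted for a local Ollama server
  model: string;
}

function buildChatRequest(cfg: ProviderConfig, prompt: string) {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (cfg.apiKey) headers["Authorization"] = `Bearer ${cfg.apiKey}`;
  return {
    url: `${cfg.baseUrl}/chat/completions`,
    headers,
    body: JSON.stringify({
      model: cfg.model,
      messages: [{ role: "user", content: prompt }],
    }),
  };
}

// Same call shape whether the target is OpenAI, Groq, or local Ollama:
const req = buildChatRequest(
  { baseUrl: "http://localhost:11434/v1", model: "llama3" },
  "Build a todo app"
);
```

Because only `baseUrl`, `apiKey`, and `model` change per provider, swapping models is just editing `.env.local`.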

How It Works

You: "Build a todo app with React, Tailwind, and local storage"

bolt.diy:
1. Generates project structure
2. Writes all component files
3. Adds styling with Tailwind
4. Implements localStorage persistence
5. Shows live preview in browser
6. You can edit code or prompt for changes
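Step 4 above, the localStorage persistence, is the kind of code bolt.diy writes for you. A hand-written sketch of what that layer typically looks like — the `Todo` shape and the storage key are illustrative assumptions, not literal bolt.diy output:

```typescript
// Sketch of a localStorage persistence layer like the one a generated todo
// app might contain. The Storage-like parameter keeps it testable outside
// the browser; in the real app you would pass window.localStorage.
interface Todo {
  id: number;
  text: string;
  done: boolean;
}

interface StorageLike {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const KEY = "todos"; // assumed storage key

function saveTodos(store: StorageLike, todos: Todo[]): void {
  store.setItem(KEY, JSON.stringify(todos));
}

function loadTodos(store: StorageLike): Todo[] {
  const raw = store.getItem(KEY);
  return raw ? (JSON.parse(raw) as Todo[]) : [];
}
```

In the generated React app this would typically sit behind a `useEffect` hook that saves on every state change.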

In-Browser Development Environment

  • Code editor with syntax highlighting and file tree
  • Live preview of your app, updating in real time
  • Terminal for running commands and installing packages
  • Version history to undo/redo AI changes
  • File management with drag-and-drop

Iterative Development

You: "Add a dark mode toggle"

bolt.diy: [modifies existing components, adds ThemeContext, updates styles]

You: "The button is too small on mobile"

bolt.diy: [adjusts responsive styles]

You: "Deploy this to Netlify"

bolt.diy: [generates build, deploys]
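The dark-mode request above usually turns into a small piece of theme state plus a class on the document root. A minimal sketch of that logic (the names and the class-based approach, matching Tailwind's `dark:` variant, are assumptions for illustration):

```typescript
// Sketch of the theme-toggle logic an "add a dark mode toggle" prompt
// typically produces. Theme names and the class-based approach are
// illustrative assumptions, not literal bolt.diy output.
type Theme = "light" | "dark";

function toggleTheme(current: Theme): Theme {
  return current === "light" ? "dark" : "light";
}

// Applied to the root element's class list so Tailwind `dark:` styles activate:
function themeClasses(existing: string[], theme: Theme): string[] {
  const rest = existing.filter((c) => c !== "dark");
  return theme === "dark" ? [...rest, "dark"] : rest;
}
```

The follow-up "button is too small on mobile" fix would then land in the same components as responsive Tailwind classes, which is what makes this prompt-edit-prompt loop practical.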

One-Click Deployment

Deploy your app directly from the browser:

  • Netlify
  • Vercel
  • Cloudflare Pages
  • Download as zip

Desktop App

bolt.diy is also available as an Electron desktop app for offline use.


FAQ

Q: What is bolt.diy? A: bolt.diy is a community open-source fork of Bolt.new with 19,200+ GitHub stars that lets you build full-stack web apps with natural language using any LLM provider. In-browser editor, live preview, and one-click deployment.

Q: How is bolt.diy different from Bolt.new? A: Bolt.new is the original commercial product by StackBlitz. bolt.diy is the community fork that adds support for any LLM provider (not just OpenAI/Anthropic), local models via Ollama, and is fully self-hostable.

Q: Is bolt.diy free? A: Yes, open-source under MIT license. You bring your own LLM API keys (or use free local models via Ollama).



Source & Thanks

Created by StackBlitz Labs and community. Licensed under MIT.

bolt.diy — ⭐ 19,200+
