bolt.diy — AI Full-Stack App Builder, Any LLM
Community fork of Bolt.new. Prompt, edit, and deploy full-stack web apps with any LLM provider. 19K+ GitHub stars.
What it is
bolt.diy is an open-source community fork of Bolt.new that provides an AI-powered full-stack web app builder. You describe what you want in natural language, and bolt.diy generates a complete web application -- frontend, backend, and deployment configuration. Unlike the original Bolt.new, bolt.diy supports any LLM provider including OpenAI, Anthropic, Google, Ollama, and many others.
It targets developers and non-developers who want to build web applications quickly using AI, especially those who want to use their own API keys and avoid vendor lock-in.
How it saves time or tokens
bolt.diy collapses the typical app development workflow into a conversation. Instead of writing boilerplate code, configuring build tools, and setting up a dev server, you describe what you want and get a working app with live preview. The multi-provider support lets you choose the cheapest or fastest model for your task. You can iterate by asking for changes in natural language rather than editing code manually.
How to use
- Clone and install:
git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy
npm install -g pnpm
pnpm install
- Configure your LLM provider in .env.local:
OPENAI_API_KEY=sk-...
# Or: ANTHROPIC_API_KEY=sk-ant-...
# Or: GOOGLE_GENERATIVE_AI_API_KEY=...
- Start the app and begin building:
pnpm run dev
# Open http://localhost:5173
# Type: 'Build a todo app with dark mode and local storage'
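The provider step above can also target a local backend. A minimal sketch that writes a .env.local pointing at Ollama; the OLLAMA_API_BASE_URL variable name is an assumption based on bolt.diy's .env.example, so verify it against your checkout:

```shell
# Sketch: write a minimal .env.local that points bolt.diy at a local
# Ollama server instead of a hosted API. The variable name is an
# assumption from bolt.diy's .env.example; check your checkout.
cat > .env.local <<'EOF'
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
EOF
grep OLLAMA_API_BASE_URL .env.local && echo "provider configured"
```

Switching to a hosted provider is just a different key in the same file, e.g. OPENAI_API_KEY=sk-... as shown above.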
Example
# Start bolt.diy locally
pnpm run dev
# In the chat interface, try these prompts:
# 'Build a dashboard that shows real-time cryptocurrency prices'
# 'Add a search filter and sort by market cap'
# 'Make it responsive for mobile devices'
# 'Add a dark/light theme toggle'
# bolt.diy generates the full app, shows a live preview,
# and lets you download or deploy the result.
Related on TokRepo
- AI tools for coding -- More AI-powered development tools
- AI tools for no-code -- No-code and low-code AI tools
Common pitfalls
- bolt.diy generates code in a WebContainer environment. Complex backend logic (databases, server processes) may not work in the preview. Export the project for production deployment.
- Model quality affects output quality. GPT-4o and Claude Sonnet produce better results for complex apps. Cheaper models may produce incomplete or buggy code.
- Generated apps use React and Vite by default. If you need a different framework, specify it explicitly in your prompt.
Frequently Asked Questions
Which LLM providers does bolt.diy support?
bolt.diy supports OpenAI, Anthropic Claude, Google Gemini, Ollama (local models), Mistral, Groq, Together AI, and several other providers. You configure your preferred provider by setting the appropriate API key in the .env.local file, and you can switch between providers without changing your workflow.
How does bolt.diy differ from Bolt.new?
bolt.diy is a community fork of Bolt.new with several additions: support for any LLM provider (not just the default), the ability to self-host, and community-contributed features. Bolt.new is the commercial product by StackBlitz. bolt.diy gives you full control over which model you use and how you host it.
Can I deploy the generated apps outside bolt.diy?
Yes. You can export the generated project as a standard web application and deploy it anywhere -- Vercel, Netlify, Cloudflare Pages, or your own server. The generated code is standard React with Vite, so it works with any hosting platform that supports static or Node.js deployments.
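Since an exported project is a plain Vite app, the usual build flow applies. A sketch, where APP_DIR is a placeholder for your exported project folder and the deploy command depends on your host:

```shell
# Sketch: build an exported bolt.diy project like any Vite app.
# APP_DIR is a placeholder for your exported project folder.
APP_DIR="${APP_DIR:-my-app}"
if [ -d "$APP_DIR" ]; then
  cd "$APP_DIR"
  pnpm install
  pnpm run build                    # Vite emits static assets into dist/
  # npx vercel deploy dist --prod   # or: netlify deploy --dir=dist --prod
else
  echo "export a project from bolt.diy first (expected ./$APP_DIR)"
fi
```

The commented deploy lines show two common static hosts; any platform that serves the dist/ directory works.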
Can I run bolt.diy with local models?
Yes. Connect bolt.diy to Ollama or any OpenAI-compatible local model server. Set the appropriate API URL and model name in your configuration. Local models avoid API costs but may produce lower-quality output for complex app generation tasks.
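Before pointing bolt.diy at a local server, it is worth confirming that the OpenAI-compatible endpoint responds. A sketch assuming Ollama's default port 11434:

```shell
# Sketch: probe a local Ollama server's OpenAI-compatible API.
# 11434 is Ollama's default port; run `ollama serve` to start one.
OLLAMA_URL="http://127.0.0.1:11434"
if curl -sf "$OLLAMA_URL/v1/models" >/dev/null 2>&1; then
  echo "Ollama reachable; available models:"
  curl -s "$OLLAMA_URL/v1/models"
else
  echo "no server at $OLLAMA_URL (start one with: ollama serve)"
fi
```

If the probe fails, pull a model first (e.g. `ollama pull llama3`) and make sure nothing else is bound to the port.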
Can I edit the generated code manually?
Yes. bolt.diy provides a code editor alongside the chat and preview panels. You can manually edit any file, and the changes reflect immediately in the preview. You can also ask the AI to modify specific files or components through the chat interface.
Source & Thanks
Created by StackBlitz Labs and community. Licensed under MIT.