# bolt.diy Features

## Multi-Provider LLM Support
| Provider | Models |
|---|---|
| OpenAI | GPT-4o, GPT-4o-mini |
| Anthropic | Claude Sonnet, Claude Haiku |
| Google | Gemini Pro, Gemini Flash |
| Groq | Llama 3, Mixtral (fast) |
| Mistral | Mistral Large, Codestral |
| Ollama | Any local model |
| OpenRouter | 100+ models |
| Together | Open-source models |
| DeepSeek | DeepSeek Coder |
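Hosted providers are configured by supplying API keys as environment variables, typically in a `.env.local` file. The variable names below are illustrative of common conventions and may differ between releases; check the project's `.env.example` for the authoritative list.

```env
# .env.local — set keys only for the providers you use
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GROQ_API_KEY=gsk_...

# Local models via Ollama need no API key, only the server URL
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
```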
## How It Works
You: "Build a todo app with React, Tailwind, and local storage"
bolt.diy:
1. Generates project structure
2. Writes all component files
3. Adds styling with Tailwind
4. Implements localStorage persistence
5. Shows live preview in browser
6. You can edit code or prompt for changes

## In-Browser Development Environment
- Code editor with syntax highlighting and file tree
- Live preview of your app updating in real-time
- Terminal for running commands and installing packages
- Version history to undo/redo AI changes
- File management with drag-and-drop
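Step 4 of the todo example above (localStorage persistence) can be sketched as a small TypeScript module. The `Todo` shape, storage key, and function names are illustrative, not what bolt.diy literally generates; the storage interface is injected so the logic also runs outside a browser, where the real app would pass `window.localStorage`.

```typescript
// Illustrative sketch of localStorage persistence for a todo app.
interface Todo {
  id: number;
  text: string;
  done: boolean;
}

// Structural subset of the browser Storage API, so the code is testable
// outside a browser. In the app itself, pass window.localStorage.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const STORAGE_KEY = "todos";

function saveTodos(store: KeyValueStore, todos: Todo[]): void {
  // Serialize the whole list; fine for small apps.
  store.setItem(STORAGE_KEY, JSON.stringify(todos));
}

function loadTodos(store: KeyValueStore): Todo[] {
  const raw = store.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as Todo[]) : [];
}
```

In the React app, `saveTodos` would run in an effect whenever the todo list changes, and `loadTodos` would seed the initial state.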
## Iterative Development
You: "Add a dark mode toggle"
bolt.diy: [modifies existing components, adds ThemeContext, updates styles]
You: "The button is too small on mobile"
bolt.diy: [adjusts responsive styles]
You: "Deploy this to Netlify"
bolt.diy: [generates build, deploys]

## One-Click Deployment
Deploy your app directly from the browser:
- Netlify
- Vercel
- Cloudflare Pages
- Download as zip
## Desktop App
bolt.diy is also available as an Electron desktop app for offline use.
## FAQ
Q: What is bolt.diy? A: bolt.diy is an open-source community fork of Bolt.new (19,200+ GitHub stars) that lets you build full-stack web apps from natural-language prompts using any LLM provider. It includes an in-browser editor, live preview, and one-click deployment.
Q: How is bolt.diy different from Bolt.new? A: Bolt.new is the original commercial product by StackBlitz. bolt.diy is the community fork that adds support for any LLM provider (not just OpenAI and Anthropic), supports local models via Ollama, and is fully self-hostable.
Q: Is bolt.diy free? A: Yes. It is open source under the MIT license; you bring your own LLM API keys (or use free local models via Ollama).