LobeChat — Open-Source Multi-Model Chat UI
Beautiful open-source chat UI supporting Claude, GPT-4, Gemini, Ollama, and 50+ providers. Plugin system, knowledge base, TTS, image generation, and self-hostable. 55,000+ GitHub stars.
What it is
LobeChat is a polished open-source chat interface that supports Claude, GPT-4, Gemini, Ollama, and 50+ other model providers. It provides a unified chat experience with plugin support, vision (image understanding), text-to-speech, knowledge base integration, and multi-modal conversations. The application can be self-hosted or deployed to Vercel with one click.
It is most useful for developers, teams, and individuals who want a single chat UI that works with multiple AI providers without vendor lock-in, eliminating the need to switch between different provider web interfaces.
How it saves time or tokens
LobeChat centralizes all model interactions in one interface. Instead of maintaining accounts and switching tabs between ChatGPT, Claude, and Gemini, you configure all providers once and switch models mid-conversation. The plugin ecosystem extends the chat with web search, code execution, and file handling without building custom integrations. Self-hosting gives you full control over data retention and avoids per-seat pricing from commercial chat products.
How to use
- Deploy LobeChat with Docker:

```shell
docker run -d -p 3210:3210 lobehub/lobe-chat
```

- Open http://localhost:3210 in your browser and configure your model providers in Settings:
  - OpenAI API key: sk-...
  - Anthropic API key: sk-ant-...
  - Ollama endpoint: http://localhost:11434
- Start chatting. Select your model from the dropdown and use features like vision, plugins, and knowledge base from the sidebar.
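Instead of entering keys in the Settings UI, the same provider configuration can be supplied as environment variables at deploy time, so the container starts preconfigured. A minimal sketch (the key values are placeholders; the variable names match those used in the Docker Compose example below):

```shell
# Pass provider API keys as environment variables at container start.
# sk-... values are placeholders for your own keys.
docker run -d -p 3210:3210 \
  -e OPENAI_API_KEY=sk-... \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  lobehub/lobe-chat
```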
Example
```shell
# One-click deploy to Vercel
npx create-lobe-chat@latest
```

```yaml
# Docker Compose with database persistence
version: '3'
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - '3210:3210'
    environment:
      - OPENAI_API_KEY=sk-...
      - ANTHROPIC_API_KEY=sk-ant-...
    volumes:
      - ./data:/app/data
```
Related on TokRepo
- AI Tools for Coding -- Developer tools and AI interfaces
- Local LLM with Ollama -- Run local models that integrate with LobeChat
Common pitfalls
- Each model provider requires its own API key configured separately. LobeChat does not include API credits -- you bring your own keys and pay each provider directly.
- Self-hosted deployments store chat history locally by default. Configure a database backend (Postgres) for production use to avoid data loss on container restart.
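To address the data-loss pitfall above, a Compose stack can add a Postgres service alongside LobeChat. This is a sketch only: the `lobehub/lobe-chat-database` image name and the `DATABASE_URL` variable are assumptions that should be verified against the current LobeChat self-hosting documentation.

```yaml
# Sketch: database-backed deployment. Image name and env vars are
# assumptions -- check the LobeChat self-hosting docs before use.
version: '3'
services:
  postgres:
    image: postgres:16
    environment:
      - POSTGRES_PASSWORD=change-me
      - POSTGRES_DB=lobechat
    volumes:
      - ./pg-data:/var/lib/postgresql/data   # persist chat data across restarts
  lobe-chat:
    image: lobehub/lobe-chat-database        # server build with DB support (assumed name)
    ports:
      - '3210:3210'
    environment:
      - DATABASE_URL=postgres://postgres:change-me@postgres:5432/lobechat
    depends_on:
      - postgres
```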
- Plugin availability varies. Not all plugins work with all model providers. Check plugin compatibility with your chosen model before relying on it in workflows.
Frequently Asked Questions
Is LobeChat free to use?
Yes. LobeChat is open source under the MIT license. The software is free to use, modify, and deploy. You pay only for the API usage of the model providers you configure (OpenAI, Anthropic, etc.).
Can LobeChat run local models?
Yes. LobeChat integrates with Ollama and other local model servers. Point LobeChat to your local Ollama endpoint and use models like Llama, Mistral, or Phi without sending data to external APIs.
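To pair this with a local Ollama server, something like the following works; `OLLAMA_ORIGINS` relaxes CORS so the browser-based LobeChat UI can call Ollama directly, and the model name is only an example:

```shell
# Assumes Ollama is already installed (https://ollama.com)
ollama pull llama3                 # fetch an example local model
OLLAMA_ORIGINS="*" ollama serve    # allow cross-origin requests from the LobeChat UI
```

Then set LobeChat's Ollama endpoint to http://localhost:11434 in Settings.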
Does LobeChat support Claude?
Yes. Configure your Anthropic API key in LobeChat settings. You can then select Claude models (Sonnet, Opus, Haiku) from the model dropdown and use them with all of LobeChat's features, including vision and plugins.
Can LobeChat analyze images?
LobeChat supports vision-capable models. Upload an image in the chat and select a model that supports vision (GPT-4 Vision, Claude with vision). The image is sent as part of the prompt, and the model analyzes it in context.
Does LobeChat support multiple users?
Yes, with the server deployment mode. LobeChat supports multi-user setups with individual accounts, conversation histories, and settings. This requires a database backend (Postgres) for user data persistence.
Citations (3)
- LobeChat GitHub Repository -- LobeChat supports 50+ model providers
- LobeChat Documentation -- Plugin ecosystem and multi-modal support
- LobeChat Official Website -- One-click Vercel deployment and Docker support