LobeChat — Modern AI Chat Framework & Agent Hub
Open-source AI chat framework with multi-agent collaboration, plugin marketplace, TTS, vision, and file upload. Supports 70+ model providers. Self-hostable. 75K+ stars.
What it is
LobeChat is an open-source AI chat framework built with Next.js that provides a polished chat interface, multi-agent collaboration, and a plugin marketplace. It supports over 70 model providers including OpenAI, Anthropic, Google, and local models.
LobeChat targets developers and teams who want a self-hosted chat interface with features like TTS, vision, file upload, and an agent marketplace -- without vendor lock-in to any single LLM provider.
How it saves time or tokens
LobeChat provides a ready-to-use chat UI that would take weeks to build from scratch. The plugin system and agent marketplace let you extend functionality without writing custom integration code. Token usage is managed per-conversation with built-in token counting and model switching.
How to use
- Run LobeChat with Docker:
  docker run -d -p 3210:3210 lobehub/lobe-chat
- Open http://localhost:3210 and add your API key.
- Start chatting, install plugins, or browse the agent marketplace.
Example
# Deploy LobeChat with environment variables
docker run -d -p 3210:3210 \
-e OPENAI_API_KEY=sk-your-key \
-e OPENAI_PROXY_URL=https://api.openai.com/v1 \
-e ACCESS_CODE=your-access-code \
lobehub/lobe-chat
# Or deploy to Vercel with one click
# Visit: https://github.com/lobehub/lobe-chat
# Click 'Deploy to Vercel' button
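The same deployment can also be captured in a Compose file so the configuration lives in version control; a minimal sketch, assuming the same environment variables as the docker run example above (the API key and access code values are placeholders):

```shell
# Write a minimal docker-compose.yml for LobeChat.
# The environment variable names mirror the docker run example above;
# the API key and ACCESS_CODE values are placeholders to replace.
cat > docker-compose.yml <<'EOF'
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - "3210:3210"
    environment:
      - OPENAI_API_KEY=sk-your-key
      - OPENAI_PROXY_URL=https://api.openai.com/v1
      - ACCESS_CODE=your-access-code
    restart: unless-stopped
EOF

# Then bring the service up (requires Docker with the compose plugin):
# docker compose up -d
```

Keeping the environment in a Compose file avoids retyping flags and makes upgrades a matter of pulling the image and re-running `docker compose up -d`.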
Related on TokRepo
- AI Tools for Agents -- Agent frameworks and orchestration tools
- AI Tools for Content -- Content generation and chat interfaces
Common pitfalls
- LobeChat stores data in the browser by default (IndexedDB). For persistent server-side storage, configure a PostgreSQL database.
- The Docker image exposes port 3210. If running behind a reverse proxy, ensure WebSocket connections are forwarded correctly for real-time features.
- Some plugins require their own API keys or backend services. Check plugin documentation before installing to avoid broken integrations.
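For the first pitfall above, server-side persistence means running the database-backed image against PostgreSQL; a minimal sketch, assuming the lobehub/lobe-chat-database image and the DATABASE_URL / KEY_VAULTS_SECRET variable names from the LobeChat docs (the connection string and secret are placeholders you must replace):

```shell
# Sketch of launching the database-backed LobeChat image.
# Variable names are assumed from the LobeChat self-hosting docs;
# the connection string and secret below are placeholders.
DATABASE_URL="postgresql://user:pass@db-host:5432/lobechat"
KEY_VAULTS_SECRET="replace-with-a-long-random-string"

# Print the command rather than executing it, so it can be reviewed first.
echo docker run -d -p 3210:3210 \
  -e "DATABASE_URL=${DATABASE_URL}" \
  -e "KEY_VAULTS_SECRET=${KEY_VAULTS_SECRET}" \
  lobehub/lobe-chat-database
```

With a database configured, conversations survive browser resets and can be shared across devices, which IndexedDB storage cannot do.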
Frequently Asked Questions
LobeChat supports over 70 model providers including OpenAI, Anthropic, Google, Azure OpenAI, AWS Bedrock, Ollama, and many others. You can switch between providers per conversation without changing your setup.
Yes. LobeChat can be deployed via Docker, Vercel, or any Node.js hosting. The Docker image is the simplest option for self-hosting with full control over data and configuration.
Yes. LobeChat integrates with Ollama for local model inference. Configure the Ollama endpoint in settings and select local models in the model picker.
The agent marketplace is a community-curated collection of pre-configured AI agents with specific system prompts and tool configurations. Users can install agents for tasks like coding, writing, translation, and research.
Yes. LobeChat has a plugin system that extends the chat interface with web search, image generation, code execution, and other capabilities. Plugins are installed from the built-in marketplace.
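The Ollama endpoint mentioned above can also be set at deploy time instead of in the settings UI; a minimal sketch, assuming the OLLAMA_PROXY_URL environment variable from the LobeChat docs and Ollama listening on its default port 11434 on the Docker host:

```shell
# Sketch of pointing LobeChat at a host-local Ollama server.
# OLLAMA_PROXY_URL is assumed from the LobeChat docs; host.docker.internal
# resolves to the Docker host on Docker Desktop -- on Linux, use the
# host's IP address instead.
OLLAMA_PROXY_URL="http://host.docker.internal:11434"

# Print the command rather than executing it, so it can be reviewed first.
echo docker run -d -p 3210:3210 \
  -e "OLLAMA_PROXY_URL=${OLLAMA_PROXY_URL}" \
  lobehub/lobe-chat
```

Setting the endpoint via the environment keeps local-model configuration reproducible across container restarts.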
Citations (3)
- LobeChat GitHub — LobeChat is an open-source AI chat framework with 75K+ GitHub stars
- LobeChat Docs — LobeChat documentation and deployment guides
- OpenAI API Documentation — OpenAI Chat Completions API specification
Source & Thanks
Created by LobeHub. Licensed under MIT. lobehub/lobe-chat — 75,000+ GitHub stars
Related Assets
Flower — Federated Learning Framework for Any ML Platform
A unified framework for federated learning and federated analytics that works with PyTorch, TensorFlow, JAX, or any machine learning library.
H2O-3 — Scalable Open-Source Machine Learning Platform
An in-memory distributed machine learning platform with AutoML support, offering gradient boosting, deep learning, GLM, and more through Python, R, and Java APIs.
Open3D — Modern Library for 3D Data Processing
An open-source library for 3D data processing with fast implementations for point clouds, meshes, RGB-D images, and 3D visualization using both C++ and Python APIs.