Mar 31, 2026 · 2 min read

LobeChat — Modern AI Chat Framework & Agent Hub

Open-source AI chat framework with multi-agent collaboration, plugin marketplace, TTS, vision, and file upload. Supports 70+ model providers. Self-hostable. 75K+ stars.

TL;DR
LobeChat is a self-hostable AI chat framework supporting 70+ model providers with multi-agent collaboration and plugins.
§01

What it is

LobeChat is an open-source AI chat framework built with Next.js that provides a polished chat interface, multi-agent collaboration, and a plugin marketplace. It supports over 70 model providers including OpenAI, Anthropic, Google, and local models.

LobeChat targets developers and teams who want a self-hosted chat interface with features like TTS, vision, file upload, and an agent marketplace -- without vendor lock-in to any single LLM provider.
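
As a sketch of what provider-agnostic configuration looks like, several providers can be enabled side by side through environment variables. The variable names below follow LobeChat's documented convention, but check the current docs for your release:

# Hedged sketch: enabling multiple providers at once via environment
# variables (names follow LobeChat's documented convention; verify
# against the docs for your release)
docker run -d -p 3210:3210 \
  -e OPENAI_API_KEY=sk-your-openai-key \
  -e ANTHROPIC_API_KEY=sk-ant-your-anthropic-key \
  -e GOOGLE_API_KEY=your-google-key \
  lobehub/lobe-chat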

§02

How it saves time or tokens

LobeChat provides a ready-to-use chat UI that would take weeks to build from scratch. The plugin system and agent marketplace let you extend functionality without writing custom integration code. Token usage is managed per conversation, with built-in token counting and model switching.

§03

How to use

  1. Run LobeChat with Docker:
docker run -d -p 3210:3210 lobehub/lobe-chat
  2. Open http://localhost:3210 and add your API key.
  3. Start chatting, install plugins, or browse the agent marketplace.
§04

Example

# Deploy LobeChat with environment variables
docker run -d -p 3210:3210 \
  -e OPENAI_API_KEY=sk-your-key \
  -e OPENAI_PROXY_URL=https://api.openai.com/v1 \
  -e ACCESS_CODE=your-access-code \
  lobehub/lobe-chat

# Or deploy to Vercel with one click
# Visit: https://github.com/lobehub/lobe-chat
# Click the 'Deploy to Vercel' button
§05

Common pitfalls

  • LobeChat stores data in the browser by default (IndexedDB). For persistent server-side storage, configure a PostgreSQL database (a deployment sketch follows this list).
  • The Docker image exposes port 3210. If running behind a reverse proxy, ensure WebSocket connections are forwarded correctly for real-time features.
  • Some plugins require their own API keys or backend services. Check plugin documentation before installing to avoid broken integrations.
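
For the server-side storage mentioned in the first pitfall, a minimal sketch looks like the following. The lobehub/lobe-chat-database image and the DATABASE_URL and KEY_VAULTS_SECRET variables follow LobeChat's self-hosting documentation, but names can change between releases, so verify them against the docs for your version:

# Hedged sketch: LobeChat with server-side PostgreSQL storage.
# Image and variable names follow the self-hosting docs; verify them
# against the release you deploy.
docker network create lobe-net

docker run -d --name lobe-postgres --network lobe-net \
  -e POSTGRES_PASSWORD=your-db-password \
  -e POSTGRES_DB=lobechat \
  pgvector/pgvector:pg16

# Generate the secret once and reuse it across restarts; it encrypts
# API keys stored in the database.
KEY_VAULTS_SECRET=$(openssl rand -base64 32)

docker run -d --network lobe-net -p 3210:3210 \
  -e DATABASE_URL=postgresql://postgres:your-db-password@lobe-postgres:5432/lobechat \
  -e KEY_VAULTS_SECRET=$KEY_VAULTS_SECRET \
  lobehub/lobe-chat-database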

Frequently Asked Questions

What model providers does LobeChat support?

LobeChat supports over 70 model providers including OpenAI, Anthropic, Google, Azure OpenAI, AWS Bedrock, Ollama, and many others. You can switch between providers per conversation without changing your setup.

Can LobeChat be self-hosted?

Yes. LobeChat can be deployed via Docker, Vercel, or any Node.js hosting. The Docker image is the simplest option for self-hosting with full control over data and configuration.

Does LobeChat support local LLMs?

Yes. LobeChat integrates with Ollama for local model inference. Configure the Ollama endpoint in settings and select local models in the model picker.
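
As an illustrative sketch (OLLAMA_PROXY_URL follows LobeChat's provider-configuration convention, and the --add-host flag is the usual way for a Linux container to reach the host):

# Hedged sketch: pointing Dockerized LobeChat at Ollama on the host.
ollama pull llama3        # fetch a local model (Ollama serves on port 11434)
docker run -d -p 3210:3210 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  lobehub/lobe-chat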

What is the LobeChat agent marketplace?

The agent marketplace is a community-curated collection of pre-configured AI agents with specific system prompts and tool configurations. Users can install agents for tasks like coding, writing, translation, and research.

Does LobeChat support plugins?

Yes. LobeChat has a plugin system that extends the chat interface with web search, image generation, code execution, and other capabilities. Plugins are installed from the built-in marketplace.


Source & Thanks

Created by LobeHub. Licensed under MIT. lobehub/lobe-chat — 75,000+ GitHub stars
