Apr 6, 2026 · 2 min read

LobeChat — Open-Source Multi-Model Chat UI

A beautiful open-source chat UI supporting Claude, GPT-4, Gemini, Ollama, and 50+ providers, with a plugin system, knowledge base, TTS, image generation, and self-hosting support. 55,000+ GitHub stars.

Quick Use

Use it first, then decide how deep to go

Copy, install, and apply the commands below to get a running instance before digging into the details.

# Docker one-line deploy
docker run -d -p 3210:3210 lobehub/lobe-chat

Open http://localhost:3210. Add your API keys in Settings > Model Providers.

Or use the hosted version at lobechat.com.
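If you'd rather configure provider keys at deploy time instead of through the Settings UI, they can be passed as environment variables. A minimal sketch, assuming conventional variable names like `OPENAI_API_KEY` and `ANTHROPIC_API_KEY` (verify the exact names your LobeChat version supports against its docs):

```shell
# Deploy with provider keys as environment variables
# (variable names are illustrative — confirm in the LobeChat docs)
docker run -d -p 3210:3210 \
  -e OPENAI_API_KEY=sk-... \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  --name lobe-chat \
  lobehub/lobe-chat
```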


Intro

LobeChat is a beautiful, open-source chat UI that supports Claude, GPT-4, Gemini, Ollama, and 50+ model providers, with 55,000+ GitHub stars. It offers a plugin ecosystem, a knowledge base (RAG), text-to-speech, image generation, and multi-model conversations, all self-hostable with Docker. Think of it as a self-hosted ChatGPT Pro that works with any model, best suited for developers and teams who want a polished AI chat interface without vendor lock-in.

  • Works with: Claude, GPT-4, Gemini, Ollama, any OpenAI-compatible API
  • Setup time: under 2 minutes


Key Features

Multi-Model Support

Use any model from any provider in one interface:

| Provider  | Models                      |
|-----------|-----------------------------|
| Anthropic | Claude Opus, Sonnet, Haiku  |
| OpenAI    | GPT-4o, o1, GPT-3.5         |
| Google    | Gemini 2.5 Pro, Flash       |
| Ollama    | Llama 3, Mistral, Codestral |
| Azure     | Azure OpenAI                |
| AWS       | Bedrock models              |

Switch models mid-conversation or compare responses side-by-side.

Plugin System

Install plugins from the marketplace:

  • Web search (Tavily, Google)
  • Code interpreter
  • Image generation (DALL-E, Midjourney)
  • Weather, calculator, translator
  • Custom plugins via OpenAPI spec
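The last item means any HTTP service described by an OpenAPI document can be wired in as a tool. A minimal sketch of such a spec, using a hypothetical weather endpoint (the server URL, path, and fields are illustrative, not a real API):

```yaml
openapi: 3.1.0
info:
  title: Weather Tool          # illustrative service, not a real plugin
  version: 1.0.0
servers:
  - url: https://example.com/api
paths:
  /weather:
    get:
      operationId: getWeather   # tools are typically invoked by operationId
      summary: Current weather for a city
      parameters:
        - name: city
          in: query
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Current conditions for the requested city
```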

Knowledge Base (RAG)

Upload documents and chat with them:

  • PDF, DOCX, TXT, Markdown
  • Automatic chunking and embedding
  • Citation tracking
  • Multi-document conversations

Text-to-Speech

Built-in TTS with multiple voices:

  • OpenAI TTS
  • Edge TTS (free)
  • ElevenLabs

Beautiful UI

  • Dark/light mode
  • Mobile responsive
  • Custom themes
  • Conversation folders
  • Markdown rendering with code highlighting

Self-Hosting Options

# Docker
docker run -d -p 3210:3210 lobehub/lobe-chat

# Docker Compose (with database)
docker compose up -d

# Vercel (one-click)
# Use the "Deploy to Vercel" button on GitHub
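The Compose route assumes a `docker-compose.yml` in the working directory. A minimal sketch of what one might look like, pairing the app with a Postgres database (service names, image tags, and env vars here are illustrative; use the official compose file from the LobeChat repo for a real deployment):

```yaml
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - "3210:3210"
    depends_on:
      - postgres
  postgres:
    image: pgvector/pgvector:pg16   # vector extension useful for the RAG knowledge base
    environment:
      POSTGRES_PASSWORD: change-me
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```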

Key Stats

  • 55,000+ GitHub stars
  • 50+ model providers
  • Plugin marketplace
  • RAG knowledge base
  • Self-hostable with Docker

FAQ

Q: What is LobeChat? A: LobeChat is an open-source chat UI supporting 50+ AI model providers with plugins, knowledge base, TTS, and self-hosting — like a self-hosted ChatGPT Pro.

Q: Is LobeChat free? A: Yes, fully open-source under Apache 2.0. Self-host for free with your own API keys.

Q: Can I use LobeChat with local models? A: Yes, connect Ollama at http://localhost:11434 and use any local model.
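One caveat: when LobeChat itself runs in Docker, `localhost` inside the container is not the host machine, so Ollama on the host must be reached via `host.docker.internal`. A sketch, assuming the proxy URL is set with an `OLLAMA_PROXY_URL` environment variable (verify the variable name against the LobeChat docs):

```shell
# Pull a local model first
ollama pull llama3

# From inside a container, reach host-side Ollama via host.docker.internal
docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  lobehub/lobe-chat
```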



Source & Thanks

Created by LobeHub. Licensed under Apache 2.0.

lobe-chat — ⭐ 55,000+

Thanks to LobeHub for building the most beautiful open-source AI chat experience.
