Configs · Apr 6, 2026 · 2 min read

LobeChat — Open-Source Multi-Model Chat UI

Beautiful open-source chat UI supporting Claude, GPT-4, Gemini, Ollama, and 50+ providers. Plugin system, knowledge base, TTS, image generation, and self-hostable. 55,000+ GitHub stars.

TL;DR
LobeChat is an open-source multi-model chat UI supporting 50+ providers with plugins, vision, TTS, and self-hosted deployment.
§01

What it is

LobeChat is a beautiful open-source chat interface that supports Claude, GPT-4, Gemini, Ollama, and 50+ other model providers. It provides a unified chat experience with features like plugin support, vision (image understanding), text-to-speech, knowledge base integration, and multi-modal conversations. The application can be self-hosted or deployed to Vercel with one click.

It benefits developers, teams, and individuals who want a polished chat UI that works with multiple AI providers without vendor lock-in. LobeChat eliminates the need to switch between different providers' web interfaces.

§02

How it saves time or tokens

LobeChat centralizes all model interactions in one interface. Instead of maintaining accounts and switching tabs between ChatGPT, Claude, and Gemini, you configure all providers once and switch models mid-conversation. The plugin ecosystem extends the chat with web search, code execution, and file handling without building custom integrations. Self-hosting gives you full control over data retention and avoids per-seat pricing from commercial chat products.

§03

How to use

  1. Deploy LobeChat with Docker:
docker run -d -p 3210:3210 lobehub/lobe-chat
  2. Open http://localhost:3210 in your browser and configure your model providers in Settings:
OpenAI API Key: sk-...
Anthropic API Key: sk-ant-...
Ollama endpoint: http://localhost:11434
  3. Start chatting. Select your model from the dropdown and use features like vision, plugins, and knowledge base from the sidebar.
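If you prefer not to enter keys in the Settings UI, providers can be configured at deploy time with environment variables. `OPENAI_API_KEY` and `ANTHROPIC_API_KEY` match the Compose example in this article; `OLLAMA_PROXY_URL` and `ACCESS_CODE` are assumed variable names, so verify them against the LobeChat docs for your version:

```shell
# Sketch: pass provider credentials at container start instead of via the UI.
# OLLAMA_PROXY_URL and ACCESS_CODE are assumptions -- check the LobeChat docs.
docker run -d -p 3210:3210 \
  -e OPENAI_API_KEY=sk-... \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  -e ACCESS_CODE=choose-a-password \
  lobehub/lobe-chat
```

Setting an access code is worthwhile for any instance reachable from outside localhost, since the UI otherwise exposes your configured keys to anyone who can load the page.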
§04

Example

# One-click deploy to Vercel
npx create-lobe-chat@latest

# Docker Compose with database persistence
version: '3'
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - '3210:3210'
    environment:
      - OPENAI_API_KEY=sk-...
      - ANTHROPIC_API_KEY=sk-ant-...
    volumes:
      - ./data:/app/data
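Assuming the Compose file above is saved as `docker-compose.yml`, a quick smoke test (the curl loop is just an illustration, not part of LobeChat) looks like:

```shell
# Start the stack in the background and confirm the web UI responds.
docker compose up -d

# Poll for up to ~30 seconds until the UI answers on port 3210.
for i in $(seq 1 30); do
  curl -sf http://localhost:3210 >/dev/null && { echo "LobeChat is up"; break; }
  sleep 1
done
```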
§05

Related on TokRepo

§06

Common pitfalls

  • Each model provider requires its own API key configured separately. LobeChat does not include API credits; you bring your own keys and pay each provider directly.
  • Self-hosted deployments store chat history locally by default. Configure a database backend (Postgres) for production use to avoid data loss on container restart.
  • Plugin availability varies. Not all plugins work with all model providers. Check plugin compatibility with your chosen model before relying on it in workflows.
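The database pitfall above (chat history lost on container restart) can be addressed by extending the Compose file with a Postgres service. This is a sketch under assumptions: the `lobehub/lobe-chat-database` image name, the `NEXT_PUBLIC_SERVICE_MODE` and `DATABASE_URL` variables, and the pgvector requirement should all be verified against the current LobeChat self-hosting docs.

```yaml
services:
  postgres:
    image: pgvector/pgvector:pg16    # server mode is believed to need pgvector (assumption)
    environment:
      - POSTGRES_PASSWORD=change-me
    volumes:
      - ./pg-data:/var/lib/postgresql/data
  lobe-chat:
    image: lobehub/lobe-chat-database   # database-backed image name (assumption)
    ports:
      - '3210:3210'
    environment:
      - NEXT_PUBLIC_SERVICE_MODE=server
      - DATABASE_URL=postgresql://postgres:change-me@postgres:5432/postgres
    depends_on:
      - postgres
```

With this setup, conversations, users, and settings persist in the `./pg-data` volume rather than inside the container.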

Frequently Asked Questions

Is LobeChat free?

Yes. LobeChat is open source under the Apache 2.0 license. The software is free to use, modify, and deploy; you pay only for the API usage of the model providers you configure (OpenAI, Anthropic, etc.).

Does LobeChat support local models?

Yes. LobeChat integrates with Ollama and other local model servers. Point LobeChat to your local Ollama endpoint and use models like Llama, Mistral, or Phi without sending data to external APIs.
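A typical local setup, assuming Ollama is already installed (`ollama pull` and `ollama serve` are standard Ollama CLI commands; the model name is just an example):

```shell
# Fetch a local model and start the Ollama server on its default port.
ollama pull llama3
ollama serve &

# Then point LobeChat's Ollama endpoint at:
#   http://localhost:11434
# If LobeChat itself runs in Docker, use the host gateway instead:
#   http://host.docker.internal:11434
```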

Can I use LobeChat with Claude?

Yes. Configure your Anthropic API key in LobeChat settings. You can then select Claude models (Sonnet, Opus, Haiku) from the model dropdown and use them with all of LobeChat's features including vision and plugins.

How does LobeChat handle image inputs?

LobeChat supports vision-capable models. Upload an image in the chat and select a model that supports vision (GPT-4 Vision, Claude with vision). The image is sent as part of the prompt and the model analyzes it in context.

Can multiple users share one LobeChat instance?

Yes with the server deployment mode. LobeChat supports multi-user setups with individual accounts, conversation histories, and settings. This requires a database backend (Postgres) for user data persistence.


Source & Thanks

Created by LobeHub. Licensed under Apache 2.0.

lobe-chat — ⭐ 55,000+

Thanks to LobeHub for building the most beautiful open-source AI chat experience.
