Scripts · Apr 8, 2026 · 3 min read

Open WebUI — Self-Hosted ChatGPT Alternative

A feature-rich, open-source web UI for running local and remote LLMs. Open WebUI supports Ollama and the OpenAI and Claude APIs, with built-in RAG, tools, multi-user support, and a mobile-friendly interface.

Script Depot · Community
Quick Use

Use it first, then decide how deep to go

The commands below are everything you need to copy, install, and run to get a working instance.

# Docker (with Ollama running on the host)
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# Opens at http://localhost:3000
# --add-host lets the container reach a host-side Ollama at host.docker.internal:11434

# Or install via pip (requires Python 3.11)
pip install open-webui
open-webui serve

What is Open WebUI?

Open WebUI is a self-hosted, feature-rich web interface for LLMs. It provides a ChatGPT-like experience that works with local models (Ollama), cloud APIs (OpenAI, Claude), or any OpenAI-compatible endpoint. Key differentiators: built-in RAG, tool/function calling, multi-user with roles, model playground, and a mobile-responsive design. Everything runs on your infrastructure.

Answer-Ready: Open WebUI is a self-hosted ChatGPT alternative with Ollama, OpenAI, and Claude support. Built-in RAG, tools, multi-user roles, model playground, and mobile UI. Full privacy on your infrastructure. One of the most popular open-source LLM frontends, with 60k+ GitHub stars.

Best for: Teams wanting a private, multi-user LLM interface. Works with: Ollama (local), OpenAI, Anthropic Claude, any OpenAI-compatible API. Setup time: Under 2 minutes.

Core Features

1. Multi-Provider Support

# In Settings → Connections:
- Ollama: http://host.docker.internal:11434
- OpenAI: sk-... (GPT-4o, GPT-4o-mini)
- Anthropic: sk-ant-... (Claude Sonnet, Haiku)
- Custom: Any OpenAI-compatible endpoint
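Every provider above speaks the same OpenAI-style chat-completions format, so a client only needs a base URL and a key to switch between them. A minimal sketch of that idea using only the standard library (the URLs and model name below are placeholder assumptions, not values from Open WebUI itself):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-style chat-completions request for any
    compatible endpoint -- Ollama, OpenAI, or a custom server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# The same code targets Ollama's OpenAI-compatible API or OpenAI itself;
# only base_url, key, and model change. Sending (urlopen) is omitted here.
req = build_chat_request("http://localhost:11434/v1", "ollama", "llama3.2", "Hello")
```

Swapping providers is then a one-line change, which is exactly why a single UI can sit in front of all of them.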

2. Built-in RAG

Upload documents directly in chat:

  • PDF, DOCX, TXT, CSV
  • Web URLs (auto-scrape)
  • Code files
  • Automatic chunking and embedding
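To make "automatic chunking and embedding" concrete: RAG pipelines typically split each document into overlapping windows before embedding, so retrieval can return passages rather than whole files. A rough illustration of that step (this is a generic sketch, not Open WebUI's internal code; chunk sizes are arbitrary):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50):
    """Split text into overlapping chunks, as RAG pipelines typically
    do before embedding. Overlap keeps sentences that straddle a
    boundary retrievable from at least one chunk."""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

# A 1000-character document becomes three overlapping chunks.
chunks = chunk_text("a" * 1000)
```

Each chunk is then embedded and stored; at chat time, the question is embedded the same way and the nearest chunks are injected into the prompt.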

3. Tools & Functions

# Create custom tools in the UI. Open WebUI exposes the methods of a
# Tools class; type hints and docstrings describe each tool to the model.
class Tools:
    def add(self, a: float, b: float) -> float:
        """Add two numbers."""
        return a + b

    def multiply(self, a: float, b: float) -> float:
        """Multiply two numbers."""
        return a * b
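Under the hood, function-calling UIs turn method signatures like these into JSON-schema descriptions the model can target. A rough illustration of that mapping (this is a simplified sketch, not Open WebUI's actual schema generator):

```python
import inspect

# Map Python annotations to JSON-schema type names.
TYPE_MAP = {float: "number", int: "integer", str: "string", bool: "boolean"}

def tool_schema(func):
    """Derive a minimal JSON-schema-style description from a function's
    type hints, illustrating how a tool becomes callable by a model."""
    sig = inspect.signature(func)
    params = {
        name: {"type": TYPE_MAP.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
        if name != "self"
    }
    return {"name": func.__name__, "parameters": params}

def add(a: float, b: float) -> float:
    return a + b

schema = tool_schema(add)
# {'name': 'add', 'parameters': {'a': {'type': 'number'}, 'b': {'type': 'number'}}}
```

The model sees only the schema; when it emits a call like `add(a=2, b=3)`, the backend dispatches to the real method and returns the result into the chat.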

4. Multi-User with Roles

Role     Permissions
Admin    Full control, user management
User     Chat, upload docs, use tools
Pending  Awaiting admin approval

5. Model Playground

Compare models side-by-side:

  • Send same prompt to multiple models
  • Compare response quality and speed
  • Helpful for model selection
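The playground's comparison boils down to fanning one prompt out to several models and collecting replies and timings. A sketch of that loop, independent of Open WebUI's UI (the `ask` backend and model names are placeholders):

```python
import time

def compare_models(prompt, models, ask):
    """Send one prompt to several models and record each latency.
    `ask` is any callable (model_name, prompt) -> reply string."""
    results = []
    for model in models:
        t0 = time.perf_counter()
        reply = ask(model, prompt)
        results.append({
            "model": model,
            "reply": reply,
            "seconds": time.perf_counter() - t0,
        })
    return results

# Stub backend for demonstration; in practice `ask` would hit a real API.
fake = lambda model, prompt: f"{model} says hi"
rows = compare_models("Hello", ["llama3.2", "gpt-4o-mini"], fake)
```

Ranking the rows by `seconds` and eyeballing the replies is essentially what the side-by-side view lets you do without writing any code.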

6. Mobile Responsive

Full-featured mobile interface — chat, upload, switch models from phone or tablet.

Open WebUI vs Alternatives

Feature           Open WebUI    LobeChat   Jan       AnythingLLM
Multi-provider    Yes           Yes        Yes       Yes
Built-in RAG      Yes           Plugin     Plugin    Yes
Multi-user        Yes (roles)   No         No        Yes
Tools/Functions   Yes           Yes        Limited   Yes
Mobile UI         Yes           Yes        No        Limited
Stars             60k+          55k+       26k+      35k+

FAQ

Q: Is it truly private? A: Yes, everything runs on your server. Use Ollama for fully local inference with zero data leaving your network.

Q: Can I use Claude with it? A: Yes, add your Anthropic API key in Settings → Connections. Claude Sonnet and Haiku are supported.

Q: How does RAG work? A: Upload files in chat or via the Documents page. Open WebUI chunks, embeds, and stores them. They are automatically retrieved during relevant conversations.


Source & Thanks

Created by Open WebUI. Licensed under MIT.

open-webui/open-webui — 60k+ stars
