Configs · May 4, 2026 · 3 min read

HuggingFace Chat UI — Open-Source AI Chat Interface

Chat UI is Hugging Face's open-source web interface for conversational AI. It powers HuggingChat and works with any text-generation model served via TGI, vLLM, Ollama, or an OpenAI-compatible API, with features like web search, tool use, and multimodal input.

Introduction

Chat UI is the open-source SvelteKit application that powers HuggingChat, Hugging Face's public AI assistant. It provides a polished conversational interface supporting multiple models, web search augmentation, tool calling, multimodal inputs, and assistant personas, all deployable on your own infrastructure.

What Chat UI Does

  • Provides a production-ready chat interface for any text-generation model
  • Supports multiple model backends: TGI, vLLM, Ollama, and OpenAI-compatible APIs
  • Enables web search integration to ground responses in current information
  • Offers tool use and function calling for models that support it
  • Manages multiple conversations with sharing, starring, and assistant presets

Architecture Overview

Chat UI is a SvelteKit application with server-side rendering. It uses MongoDB for conversation persistence, session management, and user accounts. The frontend streams tokens via Server-Sent Events. Model inference is delegated to configurable backends defined in environment variables, with each model having its own prompt template and parameters.
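As a minimal sketch of that configuration model, a `.env.local` might wire a local TGI endpoint and an OpenAI-compatible endpoint, plus the MongoDB connection used for persistence. The model names, URLs, and key placeholder below are illustrative, and exact endpoint fields may vary by Chat UI version, so check the repository's `.env` template:

```env
# Connection string for conversation persistence (local MongoDB shown)
MONGODB_URL=mongodb://localhost:27017

# MODELS is a JSON array; each entry defines one selectable model with
# its own endpoint, prompt template, and sampling parameters.
MODELS=`[
  {
    "name": "mistralai/Mistral-7B-Instruct-v0.2",
    "endpoints": [{ "type": "tgi", "url": "http://127.0.0.1:8080" }],
    "parameters": { "temperature": 0.7, "max_new_tokens": 1024 }
  },
  {
    "name": "gpt-4o-mini",
    "endpoints": [{
      "type": "openai",
      "baseURL": "https://api.openai.com/v1",
      "apiKey": "sk-placeholder"
    }]
  }
]`
```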

Self-Hosting & Configuration

  • Requires Node.js 18+ and a MongoDB instance (local or Atlas)
  • Define models in .env.local with endpoint URLs and prompt templates
  • Configure authentication via OpenID Connect or run without auth for internal use
  • Deploy with Docker using the provided Dockerfile or on any Node.js hosting
  • Enable web search by configuring a search API (SearXNG, Bing, or Google)
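For the Docker route, a compose file along these lines runs Chat UI next to a MongoDB container. This is a sketch under stated assumptions: the `ghcr.io/huggingface/chat-ui` image name and the port mapping are illustrative, and you can instead build locally from the repository's Dockerfile:

```yaml
services:
  mongodb:
    image: mongo:7
    volumes:
      - chat-ui-data:/data/db

  chat-ui:
    # Assumed image name; alternatively: build from the repo's Dockerfile
    image: ghcr.io/huggingface/chat-ui:latest
    ports:
      - "3000:3000"
    environment:
      # Reach the sibling container by its service name
      MONGODB_URL: mongodb://mongodb:27017
    env_file:
      - .env.local   # model definitions, auth, search keys
    depends_on:
      - mongodb

volumes:
  chat-ui-data:
```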

Key Features

  • Multi-model support with per-model system prompts and sampling parameters
  • Web search augmentation with source citations in responses
  • Image and file upload for multimodal models
  • Custom assistant personas with configurable instructions and tools
  • Conversation sharing via public links and export to JSON

Comparison with Similar Tools

  • Open WebUI — Python-based, Ollama-focused; Chat UI is JavaScript/SvelteKit with broader backend support
  • LobeChat — React-based with plugin marketplace; Chat UI is lighter and battle-tested at HuggingChat scale
  • LibreChat — multi-provider with file handling; Chat UI emphasizes HF ecosystem integration
  • Chatbox — desktop client; Chat UI is a web application for team deployments
  • text-generation-webui — Python/Gradio for power users; Chat UI offers a cleaner end-user experience

FAQ

Q: What models work with Chat UI? A: Any model served via TGI, vLLM, Ollama, or an OpenAI-compatible API. Configure the endpoint in .env.local.

Q: Is MongoDB required? A: Yes, for conversation storage and user sessions. A free MongoDB Atlas tier works for small deployments.

Q: Can I add web search to responses? A: Yes, configure a SearXNG instance or Bing/Google API key for RAG-style web search augmentation.

Q: How do I add authentication? A: Set OPENID_CONFIG in your environment to enable OAuth/OIDC login with any provider.
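Tying the last two answers together, a hedged sketch of the relevant environment variables follows. All values are placeholders; the variable names match Chat UI's documented configuration, but verify them against the version you deploy:

```env
# OIDC login: a JSON blob pointing at any OAuth/OIDC provider
OPENID_CONFIG=`{
  "PROVIDER_URL": "https://accounts.example.com",
  "CLIENT_ID": "your-client-id",
  "CLIENT_SECRET": "your-client-secret"
}`

# Web search: point at a self-hosted SearXNG instance (URL is
# illustrative); a hosted search API key can be configured instead.
SEARXNG_QUERY_URL=https://searxng.example.com/search
```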
