Scripts · Apr 22, 2026 · 3 min read

NextChat — Multi-Model AI Chat for Web, Desktop & Mobile

Cross-platform AI chat client supporting OpenAI, Claude, Gemini, and more with self-hosted deployment.

Introduction

NextChat (formerly ChatGPT-Next-Web) is a cross-platform AI chat client that connects to OpenAI, Azure OpenAI, Claude, Gemini, and other LLM providers through a single polished interface. It deploys in one click on Vercel or Docker and runs on web, desktop, iOS, and Android.

What NextChat Does

  • Provides a clean chat UI with markdown rendering and streaming responses
  • Supports multiple LLM providers including OpenAI, Anthropic Claude, Google Gemini, and local models
  • Offers cross-platform native apps for Windows, macOS, Linux, iOS, and Android
  • Manages conversation history, prompt templates, and model parameter tuning
  • Enables one-click self-hosted deployment via Vercel, Docker, or static export

Architecture Overview

NextChat is built with Next.js and TypeScript. The frontend communicates with LLM provider APIs directly from the client or through a lightweight proxy layer. Desktop builds use Tauri for minimal resource consumption. The app stores conversations in IndexedDB on the client side, keeping all data local by default.

Self-Hosting & Configuration

  • Deploy to Vercel with a single click using the repository template
  • Run via Docker with environment variables for API keys and base URLs
  • Configure model providers by setting OPENAI_API_KEY, AZURE_URL, GOOGLE_API_KEY, or ANTHROPIC_API_KEY
  • Set ACCESS_CODE to password-protect your deployment
  • Customize system prompts, model defaults, and UI theme through the settings panel

Key Features

  • Multi-provider support: switch between OpenAI, Claude, Gemini, and more in one interface
  • Cross-platform native apps built with Tauri for small binary size
  • Client-side data storage keeps conversations private by default
  • Prompt library with community-shared templates
  • Conversation export and import in markdown and JSON formats

Comparison with Similar Tools

  • Open WebUI — focuses on Ollama and local models; NextChat targets cloud API providers across platforms
  • LobeChat — similar feature set with plugin ecosystem; NextChat is lighter and more mobile-friendly
  • LibreChat — server-side multi-provider chat; NextChat stores data client-side
  • Cherry Studio — desktop-focused AI client; NextChat covers web and mobile as well
  • ChatBox — desktop app for LLM APIs; NextChat adds web deployment and native mobile apps

FAQ

Q: Which AI providers does NextChat support? A: OpenAI, Azure OpenAI, Anthropic Claude, Google Gemini, and any OpenAI-compatible API endpoint.

Q: Is my data stored on a server? A: No. Conversations are stored in your browser's IndexedDB by default. No server-side database is required.

Q: Can I run NextChat fully offline? A: The UI can be self-hosted, but it requires network access to reach the LLM provider APIs unless you point it at a local model server.
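Pointing the app at a local model server usually means overriding the API base URL. A minimal sketch, assuming a BASE_URL variable and an Ollama server on its default port (variable name, image name, and endpoint are assumptions, not confirmed by this guide):

```shell
# Route requests to a local OpenAI-compatible server instead of
# api.openai.com. host.docker.internal lets the container reach
# the host machine; 11434 is Ollama's default port.
docker run -d \
  --name nextchat \
  -p 3000:3000 \
  -e BASE_URL=http://host.docker.internal:11434 \
  -e OPENAI_API_KEY=placeholder \
  yidadaa/chatgpt-next-web
```

With this setup the UI itself never touches the public internet; only the local model server answers requests.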

Q: How do I update a Docker deployment? A: Pull the latest image with docker pull and restart the container. Configuration persists through environment variables.
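The pull-and-restart cycle can be sketched as the following sequence, assuming the container and image names from the earlier deployment (adjust both to match your setup):

```shell
# Fetch the newest image, then recreate the container. Environment
# variables are re-passed at run time, so configuration carries over.
docker pull yidadaa/chatgpt-next-web
docker stop nextchat && docker rm nextchat
docker run -d --name nextchat -p 3000:3000 \
  -e OPENAI_API_KEY=sk-your-key \
  -e ACCESS_CODE=your-password \
  yidadaa/chatgpt-next-web
```

Because conversations live in the browser's IndexedDB rather than in the container, recreating the container does not touch chat history.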


