Configs · May 4, 2026 · 3 min read

MaxKB — Self-Hosted AI Knowledge Base with RAG

MaxKB is an open-source knowledge base platform that combines document management with retrieval-augmented generation, letting teams build AI-powered Q&A systems over their own documents without sending data to third parties.

Introduction

MaxKB is a self-hosted knowledge base system with built-in RAG (Retrieval-Augmented Generation) capabilities. It allows organizations to upload documents, automatically chunk and embed them, and create AI chatbots that answer questions grounded in their proprietary data. Built by the 1Panel team, it emphasizes ease of deployment and privacy.

What MaxKB Does

  • Ingests documents (PDF, Word, Markdown, web pages) and builds a searchable vector store
  • Provides RAG-powered Q&A that grounds LLM responses in your uploaded documents
  • Supports multiple LLM backends: OpenAI, local models via Ollama, and Chinese providers
  • Offers a visual workflow editor for building multi-step AI applications
  • Enables embeddable chat widgets for websites and internal tools
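The ingestion step in the list above starts by splitting documents into chunks before embedding. A minimal sketch of one common strategy, fixed-size chunking with overlap (MaxKB's actual chunking logic may differ; the `chunk_size` and `overlap` values here are purely illustrative):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping fixed-size chunks for embedding."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap  # each chunk repeats the tail of the previous one
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks

doc = "MaxKB ingests documents and splits them into chunks. " * 20
pieces = chunk_text(doc, chunk_size=200, overlap=40)
```

The overlap keeps sentences that straddle a chunk boundary retrievable from both neighboring chunks, at the cost of some duplicated storage.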

Architecture Overview

MaxKB runs as a Django-based backend with a Vue.js frontend, backed by PostgreSQL with pgvector for vector storage. Documents are processed through a chunking pipeline, embedded via configurable models, and stored for similarity search. At query time, relevant chunks are retrieved and injected into the LLM prompt as context.
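The retrieve-then-inject flow described above can be sketched with toy vectors standing in for a real embedding model (pgvector performs this similarity search inside PostgreSQL; the in-memory version below only illustrates the ranking and prompt-assembly steps, and the chunk texts and vectors are made up):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec: list[float], store: list[dict], top_k: int = 2) -> list[dict]:
    """Rank stored chunks by similarity to the query embedding."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item["vec"]), reverse=True)
    return ranked[:top_k]

def build_prompt(question: str, chunks: list[dict]) -> str:
    """Inject the retrieved chunks into the LLM prompt as grounding context."""
    context = "\n---\n".join(c["text"] for c in chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Toy 3-dimensional embeddings; a real store holds model-generated vectors.
store = [
    {"text": "MaxKB stores vectors in pgvector.", "vec": [0.9, 0.1, 0.0]},
    {"text": "The frontend is built with Vue.js.", "vec": [0.0, 0.2, 0.9]},
    {"text": "Chunks are embedded before search.", "vec": [0.8, 0.3, 0.1]},
]
hits = retrieve([1.0, 0.2, 0.0], store, top_k=2)
prompt = build_prompt("Where are vectors stored?", hits)
```

Only the top-ranked chunks reach the prompt, which is what keeps the LLM's answer grounded in the uploaded documents rather than its training data.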

Self-Hosting & Configuration

  • Deploy via Docker with a single command; includes PostgreSQL and all dependencies
  • Configure LLM providers in the web UI (supports OpenAI API, Ollama, Xinference)
  • Set embedding model and chunk size parameters per knowledge base
  • Mount persistent volume for database storage and uploaded documents
  • Supports SSO integration and role-based access control for teams
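A representative single-container deployment following the bullets above; the image name, tag, port, and volume path below match the project's commonly published examples, but should be verified against the current MaxKB documentation before use:

```shell
# Single command: the container bundles the app, PostgreSQL, and dependencies.
# Mounting ~/.maxkb persists the database and uploaded documents across restarts.
docker run -d \
  --name maxkb \
  --restart=always \
  -p 8080:8080 \
  -v ~/.maxkb:/var/lib/postgresql/data \
  1panel/maxkb

# The web UI is then reachable at http://<host>:8080
```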

Key Features

  • One-click Docker deployment with minimal configuration
  • Multi-format document import with automatic chunking and embedding
  • Visual application builder for creating multi-step AI workflows
  • Embeddable chat widget with customizable appearance
  • Support for both cloud LLMs and fully offline local model deployments

Comparison with Similar Tools

  • AnythingLLM — desktop-focused; MaxKB is web-first with team collaboration features
  • Dify — broader LLMOps platform; MaxKB specializes in document Q&A with simpler setup
  • RAGFlow — advanced parsing engine; MaxKB offers a more turnkey deployment experience
  • Quivr — Python-native RAG; MaxKB provides a polished web UI out of the box
  • FastGPT — similar feature set; MaxKB integrates tightly with 1Panel for server management

FAQ

Q: What LLM providers does MaxKB support? A: OpenAI, Azure OpenAI, Ollama, vLLM, Xinference, Tongyi Qianwen, and any OpenAI-compatible API.
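"OpenAI-compatible" here means the backend accepts the `/v1/chat/completions` request shape, so one configuration path covers many providers. A sketch of that payload (the endpoint URL and model name are placeholders, not MaxKB defaults):

```python
import json

# Placeholder endpoint and model; any OpenAI-compatible backend
# (vLLM, Ollama's OpenAI shim, Xinference, ...) accepts this shape.
endpoint = "http://localhost:8000/v1/chat/completions"
payload = {
    "model": "your-model-name",
    "messages": [
        {"role": "system", "content": "Answer from the provided context."},
        {"role": "user", "content": "Where does MaxKB store vectors?"},
    ],
    "temperature": 0.1,
}
body = json.dumps(payload)  # POSTed with an Authorization: Bearer <key> header
```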

Q: Can it run fully offline? A: Yes, pair it with Ollama or a local embedding model for air-gapped deployments.

Q: What document formats are supported? A: PDF, DOCX, Markdown, TXT, HTML, and web URL crawling.

Q: Is there a limit on knowledge base size? A: No hard limit. Performance depends on your PostgreSQL and hardware resources.
