# AnythingLLM — All-in-One AI Desktop with MCP

> Full-stack AI desktop app with RAG, agents, MCP support, and multi-model chat. AnythingLLM manages documents, embeddings, and vector stores in one private interface.

## Install

Download the desktop app (see Quick Use below) or run via Docker. To connect MCP servers, merge the JSON from the MCP Server Support section into your `.mcp.json`.

## Quick Use

1. Download from [anythingllm.com](https://anythingllm.com) (Mac/Windows/Linux)
2. Choose your LLM provider (OpenAI, Anthropic, Ollama, or 15+ others)
3. Upload documents → Ask questions with full RAG

```bash
# Or run via Docker
docker pull mintplexlabs/anythingllm
docker run -p 3001:3001 mintplexlabs/anythingllm
```

## What is AnythingLLM?

AnythingLLM is a full-stack AI application that combines chat, RAG (document Q&A), agents, and MCP server support in one desktop app. Upload PDFs, websites, or code — AnythingLLM handles embedding, vector storage, and retrieval automatically. Connect any LLM provider, use built-in agents with tool calling, and extend functionality through MCP servers. Everything runs privately on your machine.

**Answer-Ready**: AnythingLLM is an all-in-one AI desktop app with chat, RAG, agents, and MCP support. Upload documents, connect any LLM, and ask questions with automatic retrieval. Built-in vector database, multi-user support, and agent workspaces. 35k+ GitHub stars.

**Best for**: Teams wanting private, self-hosted AI with document Q&A.

**Works with**: OpenAI, Anthropic Claude, Ollama, Azure, and 15+ LLM providers.

**Setup time**: Under 3 minutes.

## Core Features

### 1. Document RAG (Zero Config)

Upload any document type and ask questions:

- PDF, DOCX, TXT, CSV
- Websites (auto-scrape)
- YouTube transcripts
- GitHub repos
- Confluence and Notion exports

### 2. MCP Server Support

Connect MCP servers in the app settings:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
```
### 3. Multi-Provider LLM Support

| Provider | Models |
|----------|--------|
| OpenAI | GPT-4o, GPT-4o-mini |
| Anthropic | Claude Sonnet, Claude Haiku |
| Ollama | Llama, Mistral, Gemma (local) |
| Azure | Azure OpenAI |
| AWS Bedrock | All Bedrock models |
| Google | Gemini Pro |
| LM Studio | Any local model |
| OpenRouter | 100+ models |

### 4. Agent Workspace

Built-in agent capabilities:

- Web browsing and search
- Code execution
- File management
- RAG-enhanced responses
- MCP tool calling

### 5. Multi-User & Permissions

```
Admin   → Manage users, workspaces, models
Manager → Create workspaces, upload docs
Default → Chat within assigned workspaces
```

### 6. Built-in Vector Database

No external setup needed — AnythingLLM includes LanceDB. Also supports:

- Pinecone
- Chroma
- Weaviate
- Qdrant
- Milvus

## Use Cases

| Use Case | How |
|----------|-----|
| Company Wiki Q&A | Upload docs → RAG workspace |
| Code Assistant | Connect GitHub MCP + Ollama |
| Research | Upload papers → Ask questions |
| Customer Support | Upload knowledge base → Agent |

## FAQ

**Q: Is it truly private?**
A: Yes. The desktop app runs fully locally: use Ollama for local models and the built-in LanceDB for vectors, and nothing leaves your machine.

**Q: How does MCP integration work?**
A: Configure MCP servers in settings. Agents can call MCP tools alongside built-in tools for extended capabilities.

**Q: Can multiple people use it?**
A: Yes. Multi-user support with role-based access control; deploy via Docker for team use.

## Source & Thanks

> Created by [Mintplex Labs](https://github.com/Mintplex-Labs). Licensed under MIT.
>
> [Mintplex-Labs/anything-llm](https://github.com/Mintplex-Labs/anything-llm) — 35k+ stars
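For the Docker deployment mentioned in Quick Use and the FAQ, persisting the storage volume keeps uploaded documents, embeddings, and user accounts across container restarts. A minimal sketch; the host path is an assumption, while `STORAGE_DIR` and the `/app/server/storage` mount point follow the project's published Docker instructions:

```shell
# Persist AnythingLLM state (documents, embeddings, settings) on the host
export STORAGE_LOCATION="$HOME/anythingllm"
mkdir -p "$STORAGE_LOCATION"

docker run -d \
  -p 3001:3001 \
  -v "$STORAGE_LOCATION:/app/server/storage" \
  -e STORAGE_DIR="/app/server/storage" \
  mintplexlabs/anythingllm
```

Without the volume mount, everything a team uploads lives only inside the container and is lost when it is removed.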
---

Source: https://tokrepo.com/en/workflows/fa84339e-40e0-44b0-8aff-3f1b741e10bf
Author: MCP Hub