MCP Configs · Apr 8, 2026 · 3 min read

AnythingLLM — All-in-One AI Desktop with MCP

Full-stack AI desktop app with RAG, agents, MCP support, and multi-model chat. AnythingLLM manages documents, embeddings, and vector stores in one private interface.

What is AnythingLLM?

AnythingLLM is a full-stack AI application that combines chat, RAG (document Q&A), agents, and MCP server support into one desktop app. Upload PDFs, websites, or code — AnythingLLM handles embedding, vector storage, and retrieval automatically. Connect any LLM provider, use built-in agents with tool calling, and extend functionality through MCP servers. Everything runs privately on your machine.

Answer-Ready: AnythingLLM is an all-in-one AI desktop app with chat, RAG, agents, and MCP support. Upload documents, connect any LLM, ask questions with automatic retrieval. Built-in vector database, multi-user support, and agent workspace. 35k+ GitHub stars.

Best for: Teams wanting private, self-hosted AI with document Q&A. Works with: OpenAI, Anthropic Claude, Ollama, Azure, and 15+ LLM providers. Setup time: Under 3 minutes.

Core Features

1. Document RAG (Zero Config)

Upload any document type and ask questions:

  • PDF, DOCX, TXT, CSV
  • Websites (auto-scrape)
  • YouTube transcripts
  • GitHub repos
  • Confluence, Notion exports
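Under the hood, zero-config RAG ingestion means each uploaded document is split into overlapping chunks before embedding and storage. A minimal sketch of that chunking step (the chunk size and overlap values are illustrative, not AnythingLLM's actual defaults):

```python
# Illustrative sketch of the chunking step in a RAG ingestion pipeline.
# Chunk size and overlap are arbitrary examples, not AnythingLLM's defaults.

def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks ready for embedding."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

doc = "AnythingLLM ingests documents and splits them into chunks. " * 20
print(f"{len(chunk_text(doc))} chunks from {len(doc)} characters")
```

Overlap keeps a sentence that straddles a chunk boundary retrievable from at least one chunk, at the cost of some duplicated storage.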

2. MCP Server Support

Connect MCP servers in settings:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
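Before pasting a config like this into settings, it can help to sanity-check the shape. A small sketch that parses the mcpServers block above and prints each server's full launch command (the config string simply mirrors the example):

```python
import json

# Example mcpServers config, matching the shape shown above.
CONFIG = """
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
"""

def list_servers(raw: str) -> dict[str, list[str]]:
    """Parse the config and return each server's full launch command."""
    servers = json.loads(raw)["mcpServers"]
    return {name: [spec["command"], *spec.get("args", [])]
            for name, spec in servers.items()}

for name, cmd in list_servers(CONFIG).items():
    print(f"{name}: {' '.join(cmd)}")
```

Note that the config file must be strict JSON: comments like `// ...` inside the braces will fail to parse.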

3. Multi-Provider LLM Support

OpenAI → GPT-4o, GPT-4o-mini
Anthropic → Claude Sonnet, Claude Haiku
Ollama → Llama, Mistral, Gemma (local)
Azure → Azure OpenAI
AWS Bedrock → all Bedrock models
Google → Gemini Pro
LM Studio → any local model
OpenRouter → 100+ models
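Most of these providers accept the same OpenAI-style chat request, which is why swapping providers is largely a settings change. A sketch of the payload a client builds (the model name is illustrative; Ollama, for instance, serves this API locally):

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "stream": False,
    }

# Example: targeting a local Ollama model. The payload shape stays the
# same whether the endpoint is OpenAI, Azure, or a local server.
payload = build_chat_request("llama3", "Summarize this workspace.")
print(json.dumps(payload, indent=2))
```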

4. Agent Workspace

Built-in agent capabilities:

  • Web browsing and search
  • Code execution
  • File management
  • RAG-enhanced responses
  • MCP tool calling

5. Multi-User & Permissions

Admin → Manage users, workspaces, models
Manager → Create workspaces, upload docs
Default → Chat within assigned workspaces

6. Built-in Vector Database

No external setup needed — AnythingLLM includes LanceDB. Also supports:

  • Pinecone
  • Chroma
  • Weaviate
  • Qdrant
  • Milvus
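Whichever backend is used, the vector store answers the same question: which stored chunks are nearest to the query embedding? A dependency-free sketch of that nearest-neighbor step, with toy 3-dimensional vectors standing in for real embeddings:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query: list[float], store: dict[str, list[float]], k: int = 2) -> list[str]:
    """Return the names of the k stored chunks most similar to the query."""
    ranked = sorted(store.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# Toy embeddings: in practice these come from the embedding model.
store = {
    "pricing.pdf": [0.9, 0.1, 0.0],
    "roadmap.md":  [0.1, 0.9, 0.0],
    "faq.txt":     [0.8, 0.2, 0.1],
}
print(top_k([1.0, 0.0, 0.0], store))  # pricing.pdf and faq.txt rank highest
```

Production stores like LanceDB use approximate indexes to make this search fast over millions of vectors, but the ranking idea is the same.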

Use Cases

Company Wiki Q&A: Upload docs → RAG workspace
Code Assistant: Connect GitHub MCP + Ollama
Research: Upload papers → Ask questions
Customer Support: Upload knowledge base → Agent

FAQ

Q: Is it truly private? A: Yes. The desktop app runs fully locally: use Ollama for local models and the built-in LanceDB for vectors, and nothing leaves your machine.

Q: How does MCP integration work? A: Configure MCP servers in settings. Agents can call MCP tools alongside built-in tools for extended capabilities.

Q: Can multiple people use it? A: Yes, multi-user support with role-based access control. Deploy via Docker for team use.
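For team use, the Docker route looks roughly like this. The image name and port follow the project's published Docker image; the host storage path is an example and can be any directory:

```shell
# Pull the published image and run it with persistent storage so
# workspaces and vectors survive container restarts.
docker pull mintplexlabs/anythingllm
docker run -d \
  -p 3001:3001 \
  -v "$(pwd)/anythingllm-storage:/app/server/storage" \
  -e STORAGE_DIR="/app/server/storage" \
  --name anythingllm \
  mintplexlabs/anythingllm
```

Once running, the web UI is served on port 3001 and multi-user mode can be enabled from the admin settings.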


Source and acknowledgments

Created by Mintplex Labs. Licensed under MIT.

Mintplex-Labs/anything-llm — 35k+ stars
