MCP Configs · April 8, 2026 · 1 min read

AnythingLLM — All-in-One AI Desktop with MCP

Full-stack AI desktop app with RAG, agents, MCP support, and multi-model chat. AnythingLLM manages documents, embeddings, and vector stores in one private interface.

What is AnythingLLM?

AnythingLLM is a full-stack AI desktop app integrating chat, RAG, agents, and MCP support. Upload documents for automatic embedding and retrieval, and connect any of 15+ LLM providers.

In one sentence: Full-stack AI desktop app integrating RAG + agents + MCP, 15+ LLMs, built-in vector database, multi-user permissions — 35k+ stars.

For: Teams needing private AI document Q&A deployments.

Core Features

1. Zero-Config RAG

Upload PDFs/web pages/code — automatic embedding and retrieval.

2. MCP Servers

Configure MCP servers to extend agent capabilities.
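As a sketch of what such a configuration looks like: AnythingLLM reads MCP server definitions from a JSON file in its storage directory (typically `plugins/anythingllm_mcp_servers.json`, though the exact path may vary by version), using the same `mcpServers` schema popularized by Claude Desktop. The filesystem server and path below are illustrative assumptions:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/docs"]
    }
  }
}
```

After saving, the agent can discover and call the tools this server exposes.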

3. Multi-Model Support

OpenAI, Claude, Ollama, Bedrock, and 15+ more.

4. Multi-User

Role-based permissions with Docker deployment for teams.
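The team deployment above can be sketched with Docker. The image name and flags below follow the project README at the time of writing and may change between releases; the storage path is an assumption you should adapt:

```shell
# Persist AnythingLLM data (documents, vector DB, settings) on the host
export STORAGE_LOCATION=$HOME/anythingllm
mkdir -p "$STORAGE_LOCATION" && touch "$STORAGE_LOCATION/.env"

docker run -d -p 3001:3001 \
  --cap-add SYS_ADMIN \
  -v "$STORAGE_LOCATION:/app/server/storage" \
  -v "$STORAGE_LOCATION/.env:/app/server/.env" \
  -e STORAGE_DIR="/app/server/storage" \
  mintplexlabs/anythingllm
```

The web UI is then reachable at http://localhost:3001, where an admin can create users and assign role-based permissions.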

FAQ

Q: Is it truly private? A: The desktop app runs locally. Pair it with Ollama for local models and your data never leaves your machine.

Q: How does MCP work? A: Configure MCP servers in the agent settings; agents then discover the exposed tools and call them automatically.

🙏

Sources & Acknowledgments

Mintplex-Labs/anything-llm — 35k+ stars, MIT license
