Configs · April 1, 2026 · 1 min read

LocalAI — Run Any AI Model Locally, No GPU

LocalAI is an open-source AI engine running LLMs, vision, voice, and image models locally. 44.6K+ GitHub stars. OpenAI/Anthropic-compatible API, 35+ backends, MCP, agents. MIT licensed.

TokRepo Featured · Community
Quick Start

Try it first, then decide whether to dig deeper.

This section should tell both users and agents what to copy for the first step, what to install, and where it ends up.

# Run with Docker
docker run -p 8080:8080 localai/localai

# Or native install (macOS)
brew install localai

# Run a model from the gallery
local-ai run llama-3.2-1b-instruct:q4_k_m

# Query via OpenAI-compatible API
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
  "model": "llama-3.2-1b-instruct",
  "messages": [{"role": "user", "content": "Hello!"}]
}'
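The same OpenAI-compatible endpoint can be called from any HTTP client. A minimal Python sketch, assuming the server started by the Docker command above is listening on localhost:8080, that builds the identical request using only the standard library (the helper name `build_chat_request` is illustrative, not part of LocalAI):

```python
import json
import urllib.request

def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-style chat completion request for a LocalAI server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:8080", "llama-3.2-1b-instruct", "Hello!")
# urllib.request.urlopen(req) would send it once the server is up
```

Because the API mirrors OpenAI's, any OpenAI client library pointed at this base URL should work the same way.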

Introduction

LocalAI is an open-source AI engine that runs LLM, vision, voice, image, and video models locally, with no GPU required. With 44,600+ GitHub stars and an MIT license, it provides OpenAI- and Anthropic-compatible APIs and supports 35+ backends across NVIDIA, AMD, Intel, Apple Silicon, Vulkan, and CPU-only hardware. It also includes enterprise features such as authentication, user quotas, and RBAC, plus autonomous AI agents with tool use, RAG, and MCP integration. Models can be pulled from HuggingFace, Ollama registries, and OCI images.

Best for: Teams wanting a self-hosted AI platform with multi-model support and enterprise features
Works with: Claude Code, OpenAI Codex, Cursor, Gemini CLI, Windsurf
Hardware: NVIDIA, AMD, Intel, Apple Silicon, Vulkan, CPU-only


Key Features

  • Multi-model: LLMs, vision, voice, image, video in one platform
  • No GPU required: Runs on CPU, optional GPU acceleration
  • OpenAI + Anthropic API: Drop-in replacement for both providers
  • 35+ backends: llama.cpp, Transformers, diffusers, whisper, and more
  • AI agents: Autonomous agents with tool use, RAG, MCP
  • Enterprise: Auth, quotas, RBAC, audit logging
  • Model gallery: One-command install from HuggingFace, Ollama, OCI
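The "drop-in replacement" point extends to streaming: OpenAI-compatible chat completions stream as server-sent events whose chunks carry incremental deltas. A hedged sketch of assembling the assistant's text from such a stream; the sample chunks below are illustrative, not captured LocalAI output:

```python
import json

def collect_stream_text(sse_lines):
    """Assemble assistant text from OpenAI-style streaming chunks.

    Each line is an SSE event of the form 'data: {json}', terminated by
    'data: [DONE]', as emitted by /v1/chat/completions with "stream": true.
    """
    text = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        body = line[len("data: "):]
        if body.strip() == "[DONE]":
            break
        chunk = json.loads(body)
        delta = chunk["choices"][0]["delta"]
        text.append(delta.get("content", ""))
    return "".join(text)

# Chunks in the shape the endpoint streams back:
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    'data: [DONE]',
]
print(collect_stream_text(sample))  # -> Hello!
```

Existing OpenAI client code that consumes streams this way should not need changes when pointed at a LocalAI server.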

FAQ

Q: What is LocalAI? A: LocalAI is a local AI engine with 44.6K+ GitHub stars that runs LLM, vision, voice, and image models. It offers an OpenAI/Anthropic-compatible API and 35+ backends, requires no GPU, and is MIT licensed.

Q: How do I install LocalAI? A: Run docker run -p 8080:8080 localai/localai, or install natively with brew install localai on macOS. Then start a model with local-ai run <model-name>.



Source & Acknowledgements

Created by Ettore Di Giacinto and licensed under MIT. Repository: mudler/LocalAI (44,600+ GitHub stars).

Related Assets