Key Features
- Multi-model: LLMs, vision, voice, image, video in one platform
- No GPU required: runs on CPU, with optional GPU acceleration
- OpenAI + Anthropic API: Drop-in replacement for both providers
- 35+ backends: llama.cpp, Transformers, diffusers, whisper, and more
- AI agents: Autonomous agents with tool use, RAG, MCP
- Enterprise: Auth, quotas, RBAC, audit logging
- Model gallery: One-command install from HuggingFace, Ollama, OCI
FAQ
Q: What is LocalAI? A: LocalAI is a local AI engine (44.6K+ GitHub stars) that runs LLMs and vision, voice, and image models. It exposes an OpenAI- and Anthropic-compatible API, supports 35+ backends, requires no GPU, and is MIT licensed.
Q: How do I install LocalAI?
A: `docker run -p 8080:8080 localai/localai`, or `brew install localai` on macOS. Then start a model with `local-ai run <model-name>`.
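Once the server is up, it speaks the OpenAI wire format on the published port. Below is a minimal sketch of calling the chat-completions endpoint with only the Python standard library; the base URL assumes the default port 8080 from the Docker command above, and the model name in the usage comment is a placeholder for whichever model you installed:

```python
import json
import urllib.request

# Assumption: LocalAI is listening on localhost:8080 (the port published
# by the `docker run` command above).
BASE_URL = "http://localhost:8080/v1"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style POST request to /v1/chat/completions."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def chat(model: str, prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    req = build_chat_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage (requires a running server; "llama-3.2-1b" is a placeholder
# model name -- substitute the model you installed):
#   print(chat("llama-3.2-1b", "Hello!"))
```

Because the request body matches the OpenAI schema, existing OpenAI client code can typically be pointed at LocalAI just by changing the base URL.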