What is Jan?
Jan is an open-source desktop application for running AI models locally. It provides a ChatGPT-like interface that works completely offline. Download models with one click, chat privately, and expose a local OpenAI-compatible API for integration with other tools. Your data never leaves your machine.
Answer-Ready: Jan is an open-source desktop app for running LLMs locally. ChatGPT-like UI, one-click model downloads, OpenAI-compatible local API. Supports Llama, Mistral, Gemma, and GGUF models. Fully offline, complete privacy. 26k+ GitHub stars.
Best for: Developers and privacy-conscious users wanting local AI. Works with: Claude Code (as local backend), Cursor, any OpenAI-compatible tool. Setup time: Under 2 minutes.
Core Features
1. One-Click Model Download
Built-in model hub with curated models:
- Llama 3.1 (8B, 70B)
- Mistral 7B, Mixtral
- Gemma 2
- Phi-3
- Any GGUF model from HuggingFace
2. OpenAI-Compatible API
```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1337/v1", api_key="not-needed")
response = client.chat.completions.create(
    model="llama3.1-8b",
    messages=[{"role": "user", "content": "Explain quantum computing"}],
)
print(response.choices[0].message.content)
```
3. Extensions System
- TensorRT-LLM: NVIDIA GPU acceleration
- Remote API: Connect to OpenAI/Anthropic as fallback
- RAG: Local document Q&A
4. Cross-Platform
| Platform | GPU Support |
|---|---|
| macOS | Apple Silicon (Metal) |
| Windows | NVIDIA CUDA |
| Linux | NVIDIA CUDA, Vulkan |
Jan vs Alternatives
| Feature | Jan | Ollama | LM Studio |
|---|---|---|---|
| GUI | Full desktop app | CLI only | Full desktop app |
| API | OpenAI-compatible | OpenAI-compatible | OpenAI-compatible |
| Extensions | Plugin system | Limited | No |
| Open source | Yes (AGPL-3.0) | Yes | No |
| Model format | GGUF | GGUF, safetensors | GGUF |
FAQ
Q: What hardware do I need? A: 8GB RAM minimum for 7B models. 16GB+ recommended. Apple Silicon Macs work great with Metal acceleration.
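As a rough sizing rule, a quantized model's weights take about (parameters × bits per weight ÷ 8) bytes, plus overhead for the KV cache and runtime buffers. A minimal sketch of that arithmetic (the 1.2× overhead factor is an assumption for illustration, not a Jan-specific figure):

```python
def approx_model_ram_gb(params_billion: float,
                        bits_per_weight: int = 4,
                        overhead: float = 1.2) -> float:
    """Rough RAM estimate for a quantized model: weight size plus ~20%
    overhead for KV cache and runtime buffers (overhead is an assumption)."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb * overhead

# An 8B model at 4-bit quantization lands near 5 GB,
# which is why 8GB RAM is a workable floor for 7B-8B models.
print(round(approx_model_ram_gb(8), 1))
```

The same arithmetic shows why 70B models need a different class of machine: even at 4-bit they want roughly 40GB+ of memory.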
Q: Can I use it as a backend for Claude Code or Cursor? A: Yes, Jan exposes an OpenAI-compatible API at localhost:1337. Point any tool that supports custom endpoints to it.
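Tools differ in how they label these settings, but the three values a custom-endpoint form generally needs look like this (the model name is an example and depends on what you have downloaded in Jan; the API key can be any placeholder since Jan does not check it):

```json
{
  "base_url": "http://localhost:1337/v1",
  "api_key": "not-needed",
  "model": "llama3.1-8b"
}
```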
Q: Is it truly private? A: Yes, everything runs locally. No telemetry, no data collection. You can verify — it is open source.