CLI Tools · Apr 1, 2026 · 1 min read

LLM — CLI Tool for 100+ Language Models

LLM is a CLI tool and Python library for accessing 100+ LLMs through remote APIs or locally. 11.5K+ GitHub stars. SQLite logging, embeddings, and structured data extraction. Apache 2.0 licensed.

TL;DR
LLM is a CLI and Python library that gives you access to 100+ LLMs with built-in logging and embeddings.
§01

What it is

LLM is a command-line tool and Python library by Simon Willison that provides a unified interface to 100+ language models. It works with both API-based models (OpenAI, Anthropic, Google) and local models (via plugins). Every prompt and response is automatically logged to a SQLite database for later analysis.

Developers, researchers, and power users who work with multiple LLM providers and want a single CLI to query them all will find LLM particularly useful. It eliminates the need to install separate SDKs for each provider.
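
The Python side mirrors the CLI. A minimal sketch using the library's get_model and prompt calls, assuming an OpenAI key has already been configured with llm keys set openai (the gpt-4o-mini model name is only an example):

import llm

# Look up a model by name; the matching API key or plugin must be configured
model = llm.get_model("gpt-4o-mini")

# Send a prompt and print the full response text
response = model.prompt("Explain the CAP theorem in three sentences")
print(response.text())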

§02

How it saves time or tokens

LLM's plugin architecture means you install once and add providers as needed. The SQLite logging captures every interaction automatically, so you never lose a useful response. The embeddings subsystem lets you build semantic search over your logged conversations without a separate vector database.
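
As a rough sketch of that embeddings workflow, the commands below embed a folder of Markdown notes into a named collection and then search it. The collection name, folder, and model are illustrative; check llm embed-multi --help for the exact options in your version.

# Embed every Markdown file under notes/ into a collection called "notes",
# storing the original text alongside the vectors
llm embed-multi notes -m ada-002 --files notes/ '**/*.md' --store

# Return the stored items most similar to a query
llm similar notes -c 'how does retry logic work?'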

§03

How to use

  1. Install LLM via pip and configure your API keys for the providers you want to use.
  2. Run prompts from the command line with the llm command, or use the Python API in scripts.
  3. Query your conversation history using llm logs and the built-in SQLite database.
§04

Example

# Install and configure
pip install llm
llm keys set openai

# Simple prompt
llm 'Explain the CAP theorem in three sentences'

# Use a specific model (requires the llm-anthropic plugin)
llm -m claude-3.5-sonnet 'Write a Python decorator for retry logic'

# View conversation logs
llm logs -n 5

# Generate embeddings
llm embed -m ada-002 -c 'vector search query'
§05

Common pitfalls

  • Not installing provider plugins. LLM ships with OpenAI support by default; other providers like Anthropic or local models require separate plugin installation via llm install llm-anthropic.
  • Forgetting that logs accumulate. The SQLite database grows with every prompt. Periodically review and prune old logs if disk space is a concern (see the sketch after this list).
  • Assuming all models support the same features. Streaming, system prompts, and tool use vary by provider. Check the plugin documentation for model-specific capabilities.
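
A sketch of one way to inspect and prune the log database with the stock sqlite3 CLI. The responses table and datetime_utc column are assumptions based on the default schema; adjust the retention window to taste.

# Locate the SQLite database that llm writes to
llm logs path

# Count logged responses per model
sqlite3 "$(llm logs path)" "SELECT model, COUNT(*) FROM responses GROUP BY model;"

# Delete entries older than 90 days to reclaim disk space
sqlite3 "$(llm logs path)" "DELETE FROM responses WHERE datetime_utc < date('now', '-90 days');"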

Frequently Asked Questions

How many models does LLM support?

LLM supports 100+ models through its plugin system. Built-in support covers OpenAI models. Additional plugins add Anthropic, Google, Mistral, local models via Ollama, and many more.
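
To see which models your installation can currently reach, and to add more, something like this works (llm-anthropic is one example of a provider plugin):

# List every model the current installation knows about
llm models

# Add a provider via its plugin, then store its API key
llm install llm-anthropic
llm keys set anthropic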

Is LLM free to use?

LLM itself is free and open source under the Apache 2.0 license. You pay only for the API calls to commercial providers like OpenAI or Anthropic. Local models run at no additional cost.

Does LLM support local models?

Yes. Through plugins like llm-ollama and llm-gpt4all, LLM can query locally running models. The same CLI interface works regardless of whether the model is remote or local.
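
For example, a rough sketch of querying a local model through Ollama; it assumes Ollama itself is installed, and the model name is illustrative:

# Install the Ollama plugin for llm
llm install llm-ollama

# Pull a model with Ollama, then query it through the same llm interface
ollama pull llama3.2
llm -m llama3.2 'Summarize the CAP theorem in three sentences'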

What is the SQLite logging feature?

Every prompt you send and every response you receive is automatically stored in a local SQLite database. You can query this database with SQL, search your history, and analyze token usage patterns.
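
The history can also be searched and filtered from the CLI before dropping down to raw SQL; the flags below come from the llm logs subcommand (see llm logs --help for the full set):

# Show the three most recent prompts and responses
llm logs -n 3

# Search logged conversations for a keyword
llm logs -q 'decorator'

# Emit matching entries as JSON for further analysis
llm logs -q 'decorator' --json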

Can I use LLM as a Python library?

Yes. LLM provides a Python API alongside the CLI. You can import it in scripts, use it in notebooks, and integrate it into larger applications; logging responses to the SQLite database from Python is supported as well, though it is opt-in rather than automatic.
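
A small sketch of a multi-turn exchange through the Python API, again assuming a configured OpenAI key and an illustrative model name:

import llm

model = llm.get_model("gpt-4o-mini")

# A conversation object replays earlier turns as context for follow-up prompts
conversation = model.conversation()
print(conversation.prompt("Name three consensus algorithms").text())
print(conversation.prompt("Which of those is easiest to implement?").text())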


Source & Thanks

Created by Simon Willison. Apache 2.0. simonw/llm — 11,500+ GitHub stars
