Mar 31, 2026 · 2 min read

DSPy — Program LLMs Instead of Prompting

DSPy is a Python framework for programming language models instead of prompting them. 33.3K+ GitHub stars. Build modular AI systems — classifiers, RAG pipelines, agent loops — and let DSPy optimize the prompts and weights for you.

TokRepo Featured · Community
Quick Use

Use it first, then decide how deep to go

Copy the commands below to install DSPy and run a first query before deciding how deep to go.

# Install
pip install dspy

# Quick start
python -c "
import dspy
lm = dspy.LM('openai/gpt-4o-mini')
dspy.configure(lm=lm)
print(dspy.ChainOfThought('question -> answer')(question='What is DSPy?'))
"

Intro

DSPy is a Python framework by Stanford NLP for programming language models instead of writing brittle prompts. With 33,300+ GitHub stars and MIT license, DSPy lets you build modular AI systems — classifiers, RAG pipelines, agent loops — using compositional Python code. Instead of manually crafting prompts, DSPy compiles your program and algorithmically optimizes the prompts and weights for high-quality outputs. It supports self-improving pipelines, in-context learning with demonstrations, and works with any LLM provider.

Best for: AI engineers building production LLM pipelines who want reliable, optimized outputs without hand-tuned prompt engineering
Works with: Claude Code, OpenAI Codex, Cursor, Gemini CLI, Windsurf
Providers: OpenAI, Anthropic, Google, local models via Ollama


Key Features

  • Programming over prompting: Write Python modules instead of fragile prompt strings
  • Automatic optimization: DSPy compiles and optimizes prompts and weights algorithmically
  • Modular design: Compose classifiers, retrieval, reasoning, and agent loops as reusable modules
  • Self-improving pipelines: Compilation learns from examples to improve output quality
  • Provider agnostic: Works with OpenAI, Anthropic, Google, local models, and any LLM API
  • Built-in modules: ChainOfThought, ReAct, RAG, multi-hop reasoning out of the box

Example

import dspy

# Configure LLM
lm = dspy.LM('openai/gpt-4o-mini')
dspy.configure(lm=lm)

# Define a module (string signatures support inline types like Literal)
classify = dspy.ChainOfThought("text -> sentiment: Literal['positive', 'negative', 'neutral']")

# Use it
result = classify(text="DSPy is amazing for building reliable AI systems")
print(result.sentiment)  # e.g. 'positive'

# Optimize with examples (my_metric and examples must be defined first)
optimizer = dspy.MIPROv2(metric=my_metric, auto="medium")
optimized = optimizer.compile(classify, trainset=examples)

FAQ

Q: What is DSPy?
A: DSPy is a Stanford NLP framework with 33.3K+ stars for programming language models instead of prompting them. You write Python modules and DSPy automatically optimizes the prompts and weights for reliable, high-quality outputs. MIT licensed.

Q: How do I install DSPy?
A: Run pip install dspy. Then configure your LLM provider with dspy.configure(lm=dspy.LM('openai/gpt-4o-mini')) and start building modules.

Q: How is DSPy different from LangChain?
A: LangChain chains together prompts and tools manually. DSPy compiles and optimizes prompts algorithmically — you declare what you want, and DSPy finds a good prompt automatically. DSPy is more focused on reliability and optimization.



Source & Thanks

Created by Stanford NLP. Licensed under MIT. stanfordnlp/dspy — 33,300+ GitHub stars
