Scripts · Apr 7, 2026 · 1 min read

Instructor — Structured LLM Outputs with Pydantic

Extract structured data from LLMs using Pydantic models. Works with OpenAI, Anthropic, Gemini, and local models. The simplest way to get reliable JSON from any LLM.

What is Instructor?

Instructor patches LLM client libraries to return validated Pydantic objects instead of raw text. It handles retries, streaming, and partial responses — making structured extraction reliable across any provider.

Answer-Ready: Instructor is a Python library that extracts structured, validated data from LLMs using Pydantic models, supporting OpenAI, Anthropic, Gemini, and local models with automatic retry logic.
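The core pattern can be sketched offline. The `User` model and the sample payload below are illustrative; the actual `client.chat.completions.create(..., response_model=User)` call needs an API key, so this sketch shows only the model definition and the validation step Instructor performs on the provider's JSON reply:

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# With Instructor, the call would look like:
#   client = instructor.from_openai(OpenAI())
#   user = client.chat.completions.create(
#       model="gpt-4o", response_model=User,
#       messages=[{"role": "user", "content": "John is 30"}],
#   )
# Under the hood, the provider's JSON reply is validated like this:
user = User.model_validate({"name": "John", "age": 30})
print(user.name, user.age)
```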

Key Patterns

1. Multi-Provider Support

# Anthropic
import instructor
from anthropic import Anthropic
client = instructor.from_anthropic(Anthropic())

# Gemini
import instructor
import google.generativeai as genai
client = instructor.from_gemini(genai.GenerativeModel("gemini-1.5-pro"))

# Ollama (local)
import instructor
from openai import OpenAI
client = instructor.from_openai(
    OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),  # any non-empty key; Ollama ignores it
    mode=instructor.Mode.JSON,
)

2. Nested & Complex Types

from typing import List
from pydantic import BaseModel

class Address(BaseModel):
    street: str
    city: str
    country: str

class Company(BaseModel):
    name: str
    industry: str
    addresses: List[Address]
    employee_count: int

company = client.chat.completions.create(
    model="gpt-4o",
    response_model=Company,
    messages=[{"role": "user", "content": "..."}],
)
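The returned object is a plain Pydantic instance, so nested fields are typed attributes rather than dictionary lookups. An offline sketch, assuming the model extracted the following JSON (the company data is made up):

```python
from typing import List
from pydantic import BaseModel

class Address(BaseModel):
    street: str
    city: str
    country: str

class Company(BaseModel):
    name: str
    industry: str
    addresses: List[Address]
    employee_count: int

# JSON as the LLM might return it (illustrative data)
raw = {
    "name": "Acme",
    "industry": "Manufacturing",
    "addresses": [
        {"street": "1 Main St", "city": "Springfield", "country": "US"}
    ],
    "employee_count": 250,
}
company = Company.model_validate(raw)
print(company.addresses[0].city)  # typed attribute access, no dict indexing
```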

3. Streaming Partial Results

# Assumes a Pydantic model `User` with fields such as name and age
for partial in client.chat.completions.create_partial(
    model="gpt-4o",
    response_model=User,
    messages=[{"role": "user", "content": "John is 30"}],
):
    print(partial)  # Fields fill in progressively as tokens stream

4. Automatic Retries

user = client.chat.completions.create(
    model="gpt-4o",
    response_model=User,
    messages=[...],
    max_retries=3,  # Retries with validation errors fed back
)
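What gets fed back on a failed attempt is the Pydantic validation error. A sketch of that mechanism in isolation (the `User` model and the invalid payload are illustrative, not part of Instructor's API):

```python
from pydantic import BaseModel, ValidationError, field_validator

class User(BaseModel):
    name: str
    age: int

    @field_validator("age")
    @classmethod
    def age_must_be_non_negative(cls, v: int) -> int:
        if v < 0:
            raise ValueError("age must be non-negative")
        return v

# Simulate an invalid model response
try:
    User.model_validate({"name": "John", "age": -5})
except ValidationError as e:
    # On retry, Instructor appends an error message like this to the
    # conversation and asks the model to try again (up to max_retries)
    feedback = str(e)
    print(feedback)
```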

FAQ

Q: How does it differ from function calling? A: Instructor builds on function calling but adds Pydantic validation, automatic retries with error feedback, streaming, and multi-provider support.

Q: Does it work with Claude? A: Yes, via instructor.from_anthropic() with full tool-use support.

Q: Performance overhead? A: Minimal — it is a thin wrapper. Retries add latency only when validation fails.
