OpenAI Cookbook — Official Prompting Guides
Official prompting guides from OpenAI: GPT-5.2, Codex, Meta Prompting, and Realtime API guides. The definitive reference for OpenAI model optimization.
What it is
OpenAI Cookbook is the official collection of prompting guides and code examples maintained by OpenAI. It covers GPT-5.2, Codex, Meta Prompting, and the Realtime API, serving as the definitive reference for anyone working with OpenAI models.
This resource is aimed at developers, prompt engineers, and AI practitioners who want to move beyond trial-and-error prompting and adopt patterns validated by OpenAI themselves.
How it saves time or tokens
By following official best practices, you avoid common anti-patterns that waste tokens on retries and poorly structured prompts. The cookbook demonstrates techniques like system message structuring, few-shot formatting, and output constraining that reduce token consumption per request. Each guide distills lessons from OpenAI's internal testing, saving you hours of experimentation.
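One of those token-saving techniques, output constraining, can be sketched as a small helper that caps completion length and pins the output format. The specific limits below (`max_tokens=150`, the three-bullet instruction) are illustrative assumptions, not values from the cookbook:

```python
# Sketch: constraining output length and format to cut token spend.
# The cap and the system instruction are illustrative choices.

def build_constrained_request(user_text: str) -> dict:
    """Build a Chat Completions payload that caps output tokens
    and asks for a fixed, parseable answer format."""
    return {
        "model": "gpt-4o",
        "max_tokens": 150,  # hard cap on completion length
        "messages": [
            {
                "role": "system",
                "content": (
                    "Answer in at most three bullet points. "
                    "Do not restate the question."
                ),
            },
            {"role": "user", "content": user_text},
        ],
    }

request = build_constrained_request("Summarize the benefits of few-shot prompting.")
print(request["max_tokens"])  # 150
```

The payload dict can be splatted straight into `client.chat.completions.create(**request)`; keeping it as plain data makes the constraints easy to log and A/B test.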
How to use
- Browse the OpenAI Cookbook repository on GitHub to find the guide relevant to your model or use case.
- Follow the step-by-step code examples, adapting them to your application's language and framework.
- Apply the prompting patterns (few-shot, chain-of-thought, meta prompting) to your production prompts and measure the improvement in output quality.
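The few-shot pattern from the steps above amounts to prepending labeled input/output pairs as alternating user/assistant turns, so the model infers the task from examples. The sentiment-classification examples below are my own illustration, not taken from the cookbook:

```python
# Sketch of few-shot formatting: worked examples become prior turns.
# The task and labels here are illustrative assumptions.

FEW_SHOT_EXAMPLES = [
    ("The refund took three weeks to arrive.", "negative"),
    ("Support resolved my issue in minutes.", "positive"),
]

def few_shot_messages(query: str) -> list[dict]:
    """Build a message list: system instruction, worked examples,
    then the real query as the final user turn."""
    messages = [{
        "role": "system",
        "content": "Classify the sentiment as positive or negative.",
    }]
    for text, label in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": text})
        messages.append({"role": "assistant", "content": label})
    messages.append({"role": "user", "content": query})
    return messages

msgs = few_shot_messages("The app crashes every time I open it.")
print(len(msgs))  # 6: one system, two example pairs, one query
```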
Example
```python
# Meta Prompting pattern from OpenAI Cookbook
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

system_prompt = '''
You are an expert prompt engineer.
Given a task description, generate an optimized prompt
that maximizes accuracy and minimizes token usage.
'''

response = client.chat.completions.create(
    model='gpt-4o',
    messages=[
        {'role': 'system', 'content': system_prompt},
        {'role': 'user', 'content': 'Write a prompt for summarizing legal documents'},
    ],
)
print(response.choices[0].message.content)
```
Related on TokRepo
- Prompt Library — Browse community-contributed prompts across models and use cases
- AI Tools for Coding — Developer tools that integrate LLM prompting into workflows
Common pitfalls
- Copying prompts verbatim without adapting them to your specific model version or API endpoint.
- Ignoring the distinction between system, user, and assistant roles when structuring multi-turn conversations.
- Over-engineering prompts with excessive instructions when a concise few-shot example would perform better.
Frequently Asked Questions
Which models does the cookbook cover?
The cookbook covers GPT-5.2, Codex, and earlier GPT-4 variants. It also includes guides for the Realtime API and Meta Prompting techniques that apply across models.
Is the cookbook free to use?
Yes. The entire cookbook is open source and hosted on GitHub under the OpenAI organization. All code examples and guides are freely available.
How often is the cookbook updated?
OpenAI updates the cookbook as new models and API features are released. Contributors also submit pull requests with community-tested patterns and examples.
Can I use the code examples in production?
Yes. The code examples are designed as starting points for production applications. You should adapt parameters like temperature, max tokens, and system prompts to your specific requirements.
What is Meta Prompting?
Meta Prompting is a technique where you use an LLM to generate or refine prompts for another task. The cookbook provides patterns for building prompt-generation pipelines that improve output quality iteratively.
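A prompt-generation pipeline of the kind described above can be sketched as a draft-and-score loop. The `generate_prompt_draft` and `score_prompt` functions below are stubs standing in for real model calls and a real evaluation step; they are my own illustration, not a cookbook recipe:

```python
# Sketch of an iterative meta-prompting loop: draft candidate
# prompts, score each, keep the best. Both steps are stubbed out;
# in practice each would be an API call or an eval on held-out data.

def generate_prompt_draft(task: str, round_num: int) -> str:
    # Stub for an LLM call that writes a candidate prompt.
    return f"[draft {round_num}] You are an expert. Task: {task}"

def score_prompt(draft: str) -> float:
    # Stub for an evaluation step. Here shorter drafts score
    # higher, as a toy proxy for token cost.
    return 1.0 / len(draft)

def refine(task: str, rounds: int = 3) -> str:
    candidates = [generate_prompt_draft(task, i) for i in range(rounds)]
    return max(candidates, key=score_prompt)

print(refine("summarize legal documents"))
```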
Citations (3)
- OpenAI Cookbook GitHub — Official prompting guides from OpenAI
- OpenAI Platform Docs — Meta Prompting technique for prompt generation
- OpenAI API Reference — GPT best practices for structured outputs
Source & Thanks
Created by OpenAI. Licensed under MIT. Repository: openai-cookbook · Site: cookbook.openai.com