# BAML — Type-Safe AI Function Framework

> Write prompts as typed functions with guaranteed output schemas. BAML adds engineering to prompt engineering. Python/TS/Ruby/Java clients, Apache 2.0, 7.8K+ GitHub stars.

## Quick Use

1. Install BAML:

   ```bash
   # Python
   pip install baml-py

   # TypeScript
   npm install @boundaryml/baml

   # VS Code extension (recommended)
   code --install-extension boundary.baml-extension
   ```

2. Create a `.baml` file:

   ```baml
   // extract_resume.baml
   class Resume {
     name string
     email string
     skills string[]
     experience int @description("years of experience")
   }

   function ExtractResume(resume_text: string) -> Resume {
     client "openai/gpt-4o"
     prompt #"
       Extract structured data from this resume:
       {{ resume_text }}

       {{ ctx.output_format }}
     "#
   }
   ```

3. Generate client code and use it:

   ```bash
   npx baml-cli generate
   ```

   ```python
   from baml_client import b

   resume = b.ExtractResume("John Doe, john@email.com, 5 years Python...")
   print(resume.name)    # "John Doe" — fully typed!
   print(resume.skills)  # ["Python", ...] — guaranteed array
   ```

## Intro

BAML (Basically A Made-up Language) is a domain-specific language for defining AI functions with strict type contracts. Instead of wrestling with JSON parsing, regex extraction, or hoping your LLM returns the right format, BAML guarantees it.
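For contrast, here is a minimal sketch of the hand-rolled validation that BAML's generated client replaces. The `Resume` dataclass mirrors the `.baml` schema above; `parse_resume` is a hypothetical helper written for this illustration, not part of BAML:

```python
import json
from dataclasses import dataclass


@dataclass
class Resume:
    name: str
    email: str
    skills: list[str]
    experience: int


def parse_resume(raw: str) -> Resume:
    """Validate an LLM's JSON reply against the schema by hand."""
    data = json.loads(raw)
    # Every field must be checked and coerced manually, or schema
    # drift from the model slips silently into your program.
    if not isinstance(data.get("skills"), list):
        raise TypeError("skills must be a JSON array")
    return Resume(
        name=str(data["name"]),
        email=str(data["email"]),
        skills=[str(s) for s in data["skills"]],
        experience=int(data["experience"]),
    )


reply = '{"name": "John Doe", "email": "john@email.com", "skills": ["Python"], "experience": 5}'
resume = parse_resume(reply)
print(resume.name)    # John Doe
print(resume.skills)  # ['Python']
```

With BAML, this boilerplate disappears: the generated `baml_client` performs the equivalent validation and coercion automatically, derived from the `.baml` schema.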
Key capabilities:

- **Type-safe outputs**: Define schemas in `.baml` files, get typed objects in your code
- **Multi-language**: Generate clients for Python, TypeScript, Ruby, and Java from one `.baml` source
- **Any LLM provider**: OpenAI, Anthropic, Google, Ollama, Azure — swap with one line
- **VS Code playground**: Test prompts live with real-time type checking
- **Streaming with types**: Stream partial results that are still type-safe
- **Retry & fallback**: Built-in retry logic with provider fallback chains
- **Image & audio inputs**: Multimodal support out of the box

## FAQ

**Q: How is this different from Instructor or Pydantic AI?**

A: BAML is a dedicated DSL, not a Python library. You define schemas once in `.baml` and generate type-safe clients for any language. It also handles retries, streaming, and provider switching declaratively.

**Q: Does it work with Claude?**

A: Yes. It supports Anthropic Claude, OpenAI, Google Gemini, Ollama, Azure OpenAI, and any OpenAI-compatible API.

**Q: What about complex nested types?**

A: Fully supported — arrays, optionals, enums, unions, nested classes, and recursive types. The type system is richer than JSON Schema.

## Works With

- Claude Code, Cursor, Codex (via generated Python/TS clients)
- VS Code (official BAML extension with playground)
- Any LLM: OpenAI, Anthropic, Google, Ollama, Azure
- Python, TypeScript, Ruby, Java

## Source & Thanks

- GitHub: https://github.com/BoundaryML/baml (7.8K+ stars)
- License: Apache 2.0
- Docs: https://docs.boundaryml.com
- Maintainer: BoundaryML

---

Source: https://tokrepo.com/en/workflows/49bc0525-3802-437c-957c-2b53082c4760
Author: AI Open Source