Practical Notes
Adopt BAML when your bottleneck is iteration speed and output stability. Model your prompt as a function: inputs are typed, outputs are typed, and the prompt body is versionable. Keep a small library of BAML functions (extract, classify, route, summarize) and test them in your IDE before wiring them into agents. Over time, you get a maintainable prompt surface instead of a pile of ad-hoc strings.
Safety note: Keep function signatures small; overly generic types and huge prompts slow iteration and reduce reliability.
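The typed-function pattern above can be sketched in BAML. This is an illustrative sketch, not code from the repo: the class, function, field names, and the client string are assumptions, and exact syntax varies across BAML versions.

```baml
// Sketch only: Invoice, ExtractInvoice, and the client string are
// illustrative names, not part of the repo.
class Invoice {
  vendor string
  total float
  line_items string[]
}

function ExtractInvoice(document: string) -> Invoice {
  client "openai/gpt-4o"  // client syntax differs by BAML version
  prompt #"
    Extract the invoice fields from the text below.

    {{ document }}

    {{ ctx.output_format }}
  "#
}

// A small in-editor test case, runnable before wiring into an agent.
test basic_invoice {
  functions [ExtractInvoice]
  args {
    document "ACME Corp invoice, total $42.00"
  }
}
```

The test block is what makes the in-IDE iteration loop concrete: you run it against the live model from the editor, inspect the parsed `Invoice`, and only then call the function from app code through the generated client (typically something like `b.ExtractInvoice(...)` in Python).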
FAQ
Q: Do I need to write my whole app in BAML? A: No. The repo emphasizes using BAML for prompts/functions and wiring it into your existing app.
Q: What makes it different from plain prompting? A: You define a function signature with types and let the tool enforce output structure, retries, and streaming-friendly interfaces.
Q: Is it Python-only? A: No. The repo lists client support for Python, TypeScript, Ruby, Go, and more, via generated clients or REST patterns.