Key Features
- LLM tracing: Track every call, retrieval, embedding, and agent action
- Prompt management: Version control, collaboration, server/client caching
- Evaluations: LLM-as-judge, user feedback, manual labeling, custom pipelines
- Datasets: Benchmark testing for continuous improvement
- Playground: Interactive prompt testing and iteration
- Self-hosted: Docker, Kubernetes, Terraform (AWS/Azure/GCP)
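The tracing feature above can be sketched with the Python SDK's `observe` decorator, which records each decorated function as a span and nests calls automatically. This is a minimal sketch, not a full pipeline; the import path shown is the v3 SDK (v2 uses `from langfuse.decorators import observe`), and it assumes `LANGFUSE_PUBLIC_KEY` / `LANGFUSE_SECRET_KEY` (and optionally `LANGFUSE_HOST`) are set in the environment.

```python
# Minimal tracing sketch: each @observe()-decorated function becomes a span;
# nested calls are linked into one trace automatically.
from langfuse import observe  # v3 SDK; v2: from langfuse.decorators import observe

@observe()
def retrieve(query: str) -> list[str]:
    # Placeholder retrieval step; shows up as a child span in the trace.
    return ["doc-1", "doc-2"]

@observe()
def answer(query: str) -> str:
    docs = retrieve(query)  # nested call, attached to the parent span
    return f"Answer based on {len(docs)} documents"

answer("What is Langfuse?")
```

Because instrumentation lives in the decorator, the application code itself stays free of tracing boilerplate.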
FAQ
Q: What is Langfuse?
A: Langfuse is an open-source LLM observability platform (24.1K+ GitHub stars) for tracing, prompt management, and evaluation. Self-host it with Docker or use Langfuse Cloud. MIT licensed.
Q: How do I install Langfuse?
A: Run `pip install langfuse openai`, then swap `from openai import OpenAI` to `from langfuse.openai import OpenAI` for automatic tracing of all OpenAI calls.
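A minimal sketch of the drop-in integration described above: only the import changes, and the OpenAI client API is otherwise used as normal. It assumes `OPENAI_API_KEY` plus the Langfuse credentials (`LANGFUSE_PUBLIC_KEY` / `LANGFUSE_SECRET_KEY`) are set in the environment.

```python
# Drop-in replacement: import the OpenAI client via langfuse.openai so that
# every completion call is traced automatically.
from langfuse.openai import OpenAI  # instead of: from openai import OpenAI

client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
# Response shape is unchanged; the call appears as a trace in Langfuse.
print(completion.choices[0].message.content)
```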