What is Langtrace?
Langtrace is an open-source AI observability platform built on the OpenTelemetry standard. It automatically traces LLM calls from OpenAI, Anthropic, LangChain, and 20+ other providers.
In one sentence: an OpenTelemetry-native, open-source observability platform that traces LLM calls and tracks latency and cost.
For: Teams running LLM applications in production who need observability.
Core Features
1. Automatic Tracing
Enable with one line of initialization code; 20+ providers are instrumented automatically.
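Conceptually, initializing the SDK wraps each provider client's call methods so that every request is timed and recorded as a span. The toy sketch below illustrates that wrapping technique with a plain decorator; it is not Langtrace's actual code, and the span store and function names are hypothetical.

```python
import time
from functools import wraps

# Toy sketch of what automatic instrumentation does under the hood.
# In the real SDK, initialization patches provider methods for you;
# here a decorator plays that role. SPANS is a hypothetical store.

SPANS = []  # collected spans (in-memory, for illustration only)

def traced(span_name):
    """Wrap a function so each call records a span with its latency."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                SPANS.append({"name": span_name, "latency_ms": elapsed_ms})
        return wrapper
    return decorator

@traced("chat.completion")
def fake_llm_call(prompt):
    # Stand-in for a provider SDK call (e.g. a chat completion).
    return f"echo: {prompt}"

print(fake_llm_call("hello"))  # echo: hello
print(SPANS[0]["name"])        # chat.completion
```

The point of the pattern is that application code never touches the span store; the wrapper records latency on every call, including failures (the `finally` block).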
2. OpenTelemetry Native
Export to any OTel-compatible backend.
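Because the spans are standard OpenTelemetry data, pointing them at a different backend is typically just exporter configuration. The variable names below come from the OpenTelemetry OTLP exporter specification; the endpoint URL and header value are placeholders.

```shell
# Standard OTLP exporter settings (OpenTelemetry specification).
# The endpoint and header values here are placeholders only.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://collector.example.com:4318"
export OTEL_EXPORTER_OTLP_HEADERS="api-key=YOUR_KEY"
```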
3. Real-Time Dashboard
Latency, token usage, cost, and error rate at a glance.
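Of these metrics, cost is the one that must be derived rather than measured: it is computed from per-call token counts and a per-model price table. A minimal sketch, where the model name and the per-1K-token prices are hypothetical placeholders, not real provider rates:

```python
# Toy sketch of deriving per-call cost from token usage.
# The model name and prices are HYPOTHETICAL placeholders.

PRICES_PER_1K = {  # USD per 1,000 tokens (made-up values)
    "example-model": {"prompt": 0.0010, "completion": 0.0020},
}

def call_cost(model, prompt_tokens, completion_tokens):
    """Estimate the cost of one LLM call from its token counts."""
    p = PRICES_PER_1K[model]
    return (prompt_tokens / 1000) * p["prompt"] \
         + (completion_tokens / 1000) * p["completion"]

# 500 prompt tokens + 250 completion tokens:
print(round(call_cost("example-model", 500, 250), 6))  # 0.001
```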
4. Self-Hostable
Deploy with Docker Compose in one command.
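A hypothetical self-hosting flow might look like the following. The repository URL matches the project's GitHub organization but should be verified against the official docs before running; the compose file itself ships with the repository.

```shell
# Hypothetical self-hosting sketch; verify the repository URL
# against the official documentation before running.
git clone https://github.com/Scale3-Labs/langtrace.git
cd langtrace
docker compose up -d   # starts the stack defined in the repo's compose file
```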
FAQ
Q: How does it compare to LangFuse? A: Langtrace is built natively on the OpenTelemetry standard and offers broader automatic trace coverage across providers.
Q: What is the latency overhead? A: Spans are sent asynchronously in the background, so the overhead is under 1 ms per call.
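The reason asynchronous sending is cheap is that the calling thread only pays for an in-memory enqueue, while a background worker performs the slow network export. A self-contained sketch of that technique (not Langtrace's actual exporter; names are illustrative):

```python
import queue
import threading
import time

# Toy sketch of asynchronous span export: the hot path is a queue put,
# and a background worker thread does the (simulated) network I/O.

span_queue = queue.Queue()
exported = []

def worker():
    while True:
        span = span_queue.get()
        if span is None:          # shutdown sentinel
            break
        time.sleep(0.01)          # stand-in for a network call
        exported.append(span)

t = threading.Thread(target=worker, daemon=True)
t.start()

def record_span(span):
    """Hot-path cost: one in-memory enqueue, typically well under 1 ms."""
    span_queue.put(span)

start = time.perf_counter()
record_span({"name": "chat.completion", "latency_ms": 42})
enqueue_ms = (time.perf_counter() - start) * 1000

span_queue.put(None)   # signal shutdown and drain the queue
t.join()

print(f"enqueue took {enqueue_ms:.3f} ms")
print(exported[0]["name"])  # chat.completion
```

The enqueue time is what the application observes; the 10 ms simulated export happens off the request path entirely.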