Configs · April 7, 2026 · 1 min read

Langtrace — Open Source AI Observability Platform

Open-source observability for LLM apps. Trace OpenAI, Anthropic, and LangChain calls with OpenTelemetry-native instrumentation and a real-time dashboard.

What is Langtrace?

Langtrace is an open-source AI observability platform that automatically traces LLM calls from OpenAI, Anthropic, LangChain, and 20+ other providers — based on the OpenTelemetry standard.

In one sentence: an open-source AI observability platform with OpenTelemetry-native LLM call tracing, plus latency metrics and cost tracking.

For: Teams running LLM applications in production who need observability.

Core Features

1. Automatic Tracing

Enable with one line — supports 20+ providers.
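A minimal sketch of the one-line setup, based on the Langtrace Python SDK's documented usage as I recall it (module name, package name, and `init` signature should be verified against the current release):

```python
# Hedged sketch: enable Langtrace before importing any LLM client so the
# SDK can patch provider libraries (openai, anthropic, etc.) on import.
try:
    from langtrace_python_sdk import langtrace  # pip install langtrace-python-sdk
    langtrace.init(api_key="<your-api-key>")    # the one line that enables tracing
    status = "tracing enabled"
except ImportError:
    status = "langtrace-python-sdk not installed"

print(status)
```

After `init`, calls made through supported client libraries are traced automatically with no per-call changes.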

2. OpenTelemetry Native

Export to any OTel-compatible backend.
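Because the spans are standard OpenTelemetry, you can point them at any OTLP-compatible backend using the standard exporter environment variables (these variables come from the OpenTelemetry specification, not Langtrace; the endpoint and header values below are placeholders):

```shell
# Standard OTLP exporter configuration (OpenTelemetry spec)
export OTEL_EXPORTER_OTLP_ENDPOINT="https://collector.example.com:4318"  # hypothetical collector
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<token>"
```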

3. Real-Time Dashboard

Latency, token usage, cost, and error rate at a glance.

4. Self-Hostable

Deploy with Docker Compose in one command.
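A sketch of the self-hosted deployment, assuming the compose file lives at the repo root as the Scale3-Labs/langtrace README describes (the default dashboard port is an assumption; check the repo's `.env` and compose file):

```shell
# Hedged sketch: clone and start the full stack locally.
git clone https://github.com/Scale3-Labs/langtrace.git
cd langtrace
docker compose up -d  # dashboard typically on http://localhost:3000 (assumption)
```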

FAQ

Q: How does it compare to LangFuse? A: Langtrace is built on the OpenTelemetry standard with broader automatic trace coverage.

Q: What is the latency overhead? A: Spans are sent asynchronously, adding less than 1 ms per call.
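The sub-millisecond claim follows from the async-export pattern: the instrumented call only enqueues the span, and a background worker does the slow network send. A stdlib-only sketch of that pattern (not Langtrace's actual exporter code):

```python
import queue
import threading
import time

# Hot path enqueues spans; a background worker "exports" them.
span_queue: queue.Queue = queue.Queue()
exported = []

def worker():
    while True:
        span = span_queue.get()
        if span is None:  # sentinel: shut down
            break
        time.sleep(0.01)  # simulate network latency to the backend
        exported.append(span)

t = threading.Thread(target=worker, daemon=True)
t.start()

start = time.perf_counter()
span_queue.put({"name": "openai.chat", "latency_ms": 240})  # the only hot-path cost
enqueue_ms = (time.perf_counter() - start) * 1000

span_queue.put(None)
t.join()
print(f"enqueue took {enqueue_ms:.3f} ms; exported {len(exported)} span(s)")
```

The enqueue is an in-memory operation, so the traced call never waits on the network.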


Sources & Credits

Scale3-Labs/langtrace — 3k+ stars, AGPL-3.0

