Configs · Mar 31, 2026 · 2 min read

Langfuse — Open Source LLM Observability

Langfuse is an open-source LLM engineering platform for tracing, prompt management, evaluation, and debugging AI apps. 24.1K+ GitHub stars. Self-hosted or cloud. MIT licensed.

TL;DR
Langfuse provides open-source tracing, prompt management, and evaluation for LLM applications. Self-hosted or cloud, MIT licensed.
§01

What it is

Langfuse is an open-source LLM engineering platform that provides observability for AI applications. It traces every LLM call, tracks token usage and latency, manages prompt versions, and supports evaluation workflows. You integrate it with a few lines of code and get a dashboard showing how your AI application performs in production.

Langfuse targets AI engineers, ML teams, and product developers who build LLM-powered applications and need to understand cost, quality, and performance. It is available as a cloud service or self-hosted under the MIT license.

§02

How it saves time or tokens

Without observability, debugging LLM applications means adding print statements, manually counting tokens, and guessing why outputs degrade. Langfuse automatically traces every call, records inputs/outputs, measures latency, and calculates costs. Prompt management lets you version and A/B test prompts without code changes. This visibility helps you identify expensive or slow calls and optimize them, directly reducing token waste.
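To make the cost angle concrete, here is a back-of-envelope sketch of the per-call cost arithmetic that token-level tracing enables. The prices in PRICES are illustrative placeholders, not real provider rates:

```python
# Sketch: estimating per-call cost from the token counts a tracer records.
# PRICES uses illustrative placeholder rates, not real provider pricing.
PRICES = {'gpt-4o': {'input': 2.50 / 1_000_000, 'output': 10.00 / 1_000_000}}

def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """USD cost of one call, given assumed per-token prices for the model."""
    p = PRICES[model]
    return input_tokens * p['input'] + output_tokens * p['output']

# A call with 150 input and 200 output tokens under these assumed rates:
print(f"${call_cost('gpt-4o', 150, 200):.6f}")  # → $0.002375
```

Summing this over thousands of traced calls is what surfaces the handful of prompts or steps responsible for most of your spend.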

§03

How to use

  1. Install the SDK:
pip install langfuse openai
  2. Add tracing with a one-line import swap:
from langfuse.openai import openai

client = openai.OpenAI()
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello'}]
)
  3. View traces in the Langfuse dashboard at cloud.langfuse.com or your self-hosted instance.
  4. Use the prompt management UI to version and deploy prompts without redeploying code.
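Step 4 relies on fetching prompts at runtime rather than hardcoding them. A minimal local sketch of that fetch-then-compile flow follows; fetch_prompt, REGISTRY, and the {{variable}} placeholder handling here are illustrative stand-ins, not the Langfuse API:

```python
# Minimal local sketch of the fetch-then-compile pattern behind runtime
# prompt management. fetch_prompt is a stand-in for a prompt-registry client.
REGISTRY = {'qa-v2': 'Answer concisely: {{question}}'}

def fetch_prompt(name: str) -> str:
    """Look up the currently deployed template for a named prompt."""
    return REGISTRY[name]

def compile_prompt(template: str, **variables: str) -> str:
    """Substitute {{key}} placeholders with the supplied values."""
    for key, value in variables.items():
        template = template.replace('{{' + key + '}}', value)
    return template

prompt = compile_prompt(fetch_prompt('qa-v2'), question='What does Langfuse trace?')
print(prompt)  # → Answer concisely: What does Langfuse trace?
```

Because the template lives in the registry rather than in code, deploying a new prompt version changes what fetch_prompt returns without a code release.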
§04

Example

from langfuse import Langfuse

langfuse = Langfuse()

# Create a trace for a multi-step workflow
trace = langfuse.trace(name='rag-pipeline')

# Span for retrieval step
retrieval = trace.span(name='retrieval')
# ... your retrieval logic ...
retrieval.end(output={'docs_found': 5})

# Generation span for LLM call
generation = trace.generation(
    name='answer-generation',
    model='gpt-4o',
    input=[{'role': 'user', 'content': 'question'}]
)
# ... your LLM call ...
generation.end(output='answer text', usage={'input': 150, 'output': 200})
§05

Common pitfalls

  • Tracing adds a small latency overhead per call. For latency-sensitive applications, use async flushing (enabled by default) and batch spans.
  • Self-hosted Langfuse requires PostgreSQL and ClickHouse. Plan for database maintenance and storage growth as trace volume increases.
  • Prompt management works best when prompts are fetched at runtime. Hardcoded prompts in code bypass the versioning system entirely.

Frequently Asked Questions

Is Langfuse free?

Langfuse is open source under the MIT license. Self-hosting is completely free. The cloud-hosted version has a free tier with usage limits and paid plans for higher volume.

What LLM providers does Langfuse support?

Langfuse integrates with OpenAI, Anthropic, Google, AWS Bedrock, Azure OpenAI, and any provider via the generic SDK. Framework integrations exist for LangChain, LlamaIndex, and Haystack.

Can I self-host Langfuse?

Yes. Langfuse provides Docker images and Helm charts for self-hosting. It requires PostgreSQL and ClickHouse. The self-hosted version has full feature parity with the cloud version.

How does Langfuse compare to LangSmith?

LangSmith is LangChain's proprietary observability platform. Langfuse is open source, framework-agnostic, and self-hostable. If you use LangChain exclusively, LangSmith has deeper integration. If you want vendor independence, Langfuse is the better choice.

Does Langfuse support evaluation?

Yes. Langfuse supports manual annotation, model-based evaluation, and custom scoring functions. You can define evaluation criteria and score traces programmatically or through the UI.
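A custom scoring function can be as simple as a deterministic check applied to each trace's output. A hedged sketch follows; exact_match is an illustrative criterion of my own, not a Langfuse built-in:

```python
# Sketch of a custom scoring function of the kind you might attach to traces.
# exact_match is an illustrative criterion, not part of the Langfuse SDK.
def exact_match(output: str, expected: str) -> float:
    """Score 1.0 when output matches expected, ignoring case and whitespace."""
    return 1.0 if output.strip().lower() == expected.strip().lower() else 0.0

pairs = [('Paris', 'paris'), ('Lyon', 'Paris')]
scores = [exact_match(out, exp) for out, exp in pairs]
print(scores)  # → [1.0, 0.0]
```

Model-based evaluation replaces the deterministic check with an LLM judge, but the scoring shape (name, numeric value, attached to a trace) stays the same.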


Source & Thanks

Created by Langfuse. Licensed under MIT. langfuse/langfuse — 24,100+ GitHub stars
