Apr 7, 2026 · 1 min read

LangFuse — Open Source LLM Observability & Tracing

Trace, evaluate, and monitor LLM applications in production. Open-source alternative to LangSmith with prompt management, cost tracking, and evaluation pipelines.

Quick Use

Use it first, then decide how deep to go

Install the SDK, configure your keys, and log a first trace:

pip install langfuse

from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-...",
    secret_key="sk-...",
    host="https://cloud.langfuse.com",
)

# Trace a generation
trace = langfuse.trace(name="chat")
generation = trace.generation(
    name="llm-call",
    model="gpt-4o",
    input=[{"role": "user", "content": "Hello"}],
    output="Hi there!",
    usage={"input": 10, "output": 5},
)
langfuse.flush()

What is LangFuse?

LangFuse is an open-source observability platform for LLM applications. It provides tracing, prompt management, evaluation, and cost analytics — helping teams debug, improve, and monitor their AI features in production.


Core Features

1. Distributed Tracing

Trace complex chains and agent workflows:

from langfuse.decorators import observe

@observe()
def my_agent(query: str):
    context = retrieve_docs(query)
    return generate_response(query, context)

@observe()
def retrieve_docs(query: str):
    # Automatically nested as child span
    return vector_db.search(query)

@observe()
def generate_response(query: str, context: str):
    return openai.chat.completions.create(...)

2. Framework Integrations

# OpenAI SDK (drop-in)
from langfuse.openai import openai
# All calls automatically traced

# LangChain
from langfuse.callback import CallbackHandler
handler = CallbackHandler()
chain.invoke({"input": "..."}, config={"callbacks": [handler]})

# LlamaIndex
from langfuse.llama_index import LlamaIndexCallbackHandler
from llama_index.core import Settings

langfuse_handler = LlamaIndexCallbackHandler()
Settings.callback_manager.add_handler(langfuse_handler)

3. Prompt Management

Version and deploy prompts from the LangFuse UI:

prompt = langfuse.get_prompt("customer-support-v2")
compiled = prompt.compile(customer_name="Alice")
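
Under the hood, compile is template substitution: the prompt text stored in the UI contains {{variable}} placeholders that get filled in at call time. A minimal sketch of the idea (compile_prompt is a hypothetical helper, not part of the SDK; the double-curly syntax matches LangFuse's prompt templates):

```python
import re


def compile_prompt(template: str, **variables: str) -> str:
    """Replace {{name}} placeholders with the given values.

    Unknown placeholders are left untouched rather than raising,
    so partially-filled prompts remain inspectable.
    """
    def substitute(match: re.Match) -> str:
        key = match.group(1).strip()
        return variables.get(key, match.group(0))

    return re.sub(r"\{\{(.*?)\}\}", substitute, template)


template = "Hello {{customer_name}}, how can we help you today?"
print(compile_prompt(template, customer_name="Alice"))
```

Because the template lives in the LangFuse UI rather than in code, prompts can be versioned and rolled back without a redeploy.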

4. Evaluation Pipelines

Score traces manually or with LLM-as-judge:

langfuse.score(
    trace_id="trace-123",
    name="helpfulness",
    value=0.9,
    comment="Accurate and complete",
)
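
Score values are floats, so judge output usually needs normalizing before it is logged. A small sketch of one common convention (the 1–5 rating scale and the normalize_rating helper are assumptions for illustration, not part of the SDK):

```python
def normalize_rating(rating: int, low: int = 1, high: int = 5) -> float:
    """Map an integer judge rating on [low, high] to a 0-1 score."""
    if not low <= rating <= high:
        raise ValueError(f"rating {rating} outside [{low}, {high}]")
    return (rating - low) / (high - low)


# An LLM judge returning "4 out of 5" becomes a 0-1 helpfulness score
# suitable for passing as value= to langfuse.score():
print(normalize_rating(4))  # 0.75
print(normalize_rating(1))  # 0.0
```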

5. Cost Dashboard

Automatic cost calculation per model, per user, per feature.
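
The calculation itself is straightforward once token usage is recorded on each generation: multiply tokens by per-model prices. A sketch of the arithmetic (the prices below are hypothetical placeholders; the real dashboard uses LangFuse's maintained model-price registry):

```python
# Hypothetical per-1M-token prices, keyed by model name.
PRICES = {
    "gpt-4o": {"input": 2.50, "output": 10.00},
}


def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Compute the USD cost of one LLM call from its token usage."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000


# The quick-start trace above reported usage={"input": 10, "output": 5}:
print(call_cost("gpt-4o", input_tokens=10, output_tokens=5))
```

Aggregating these per-call costs over trace metadata (user ID, feature tag) is what yields the per-user and per-feature breakdowns.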

Self-Hosting

docker compose up -d  # PostgreSQL + LangFuse server

Or use the managed cloud at cloud.langfuse.com.

FAQ

Q: How does it compare to LangSmith? A: LangFuse is open-source and self-hostable; LangSmith is a proprietary, hosted product from the LangChain team. The feature sets overlap (tracing, evaluation, prompt management).

Q: Does it work without LangChain? A: Yes, framework-agnostic. Works with any Python or JS/TS app.

Q: What is the production overhead? A: Minimal — the SDK is async by default; traces are batched and sent in the background, adding under 1 ms of overhead per call.
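
The reason the overhead stays low is the fire-and-forget pattern: the hot path only enqueues an event, and a background thread batches and ships them. A self-contained sketch of that pattern (BatchedSender is illustrative, not the SDK's internals; self.sent stands in for HTTP POSTs to the ingestion API):

```python
import queue
import threading
import time


class BatchedSender:
    """Callers enqueue events in O(1); a daemon thread flushes batches."""

    def __init__(self, batch_size: int = 10):
        self.events: "queue.Queue[dict]" = queue.Queue()
        self.sent: list[list[dict]] = []  # stand-in for delivered HTTP batches
        self.batch_size = batch_size
        threading.Thread(target=self._worker, daemon=True).start()

    def enqueue(self, event: dict) -> None:
        # Returns immediately: no network I/O on the request path.
        self.events.put(event)

    def _worker(self) -> None:
        batch: list[dict] = []
        while True:
            try:
                batch.append(self.events.get(timeout=0.05))
            except queue.Empty:
                pass
            # Flush on a full batch, or when the queue drains.
            if len(batch) >= self.batch_size or (batch and self.events.empty()):
                self.sent.append(batch)
                batch = []


sender = BatchedSender()
for i in range(25):
    sender.enqueue({"trace": i})

# Wait (bounded) for the background thread to deliver everything.
deadline = time.time() + 2.0
while sum(len(b) for b in sender.sent) < 25 and time.time() < deadline:
    time.sleep(0.01)
print(sum(len(b) for b in sender.sent))  # 25
```

This is also why an explicit langfuse.flush() matters in short-lived scripts: it drains the queue before the process exits.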
