Configs · April 1, 2026 · 1 min read

Langfuse — Open Source LLM Observability

Langfuse is an open-source LLM engineering platform for tracing, prompt management, evaluation, and debugging AI apps. 24.1K+ GitHub stars. Self-hosted or cloud. MIT.

TokRepo Picks · Community
Quick Start

Try it first, then decide whether to dig deeper.

This section tells both users and agents what to copy first, what to install, and where it goes.

# Install
pip install langfuse openai

# Add tracing to your LLM calls
python -c "
from langfuse.openai import openai
# Automatic tracing — just swap the import
client = openai.OpenAI()
response = client.chat.completions.create(
    model='gpt-4o-mini',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)
print(response.choices[0].message.content)
# Traces visible at http://localhost:3000
"

# Self-host with Docker
docker compose up -d  # from langfuse repo
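Before the snippet above can export traces, the SDK needs project credentials. A minimal environment setup might look like the following (the key values are placeholders — create real keys in your project settings; the host line is only needed when pointing at a self-hosted instance):

```shell
# Langfuse credentials — values below are placeholders, not real keys
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
# Point the SDK at your self-hosted instance (omit to use Langfuse Cloud)
export LANGFUSE_HOST="http://localhost:3000"
```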

Introduction

Langfuse is an open-source LLM engineering platform that helps teams collaboratively develop, monitor, evaluate, and debug AI applications. With 24,100+ GitHub stars and an MIT license, Langfuse provides LLM observability with detailed tracing of calls, retrievals, embeddings, and agent actions. It includes centralized prompt management with version control, evaluation pipelines (LLM-as-judge, user feedback, manual labeling), dataset testing, an interactive playground, and SDKs for Python and JavaScript. Self-host in minutes with Docker or Kubernetes, or deploy on AWS/Azure/GCP with Terraform.

Best for: Teams building production AI apps who need observability, prompt management, and evaluation
Works with: Claude Code, OpenAI Codex, Cursor, Gemini CLI, Windsurf
Integrations: OpenAI, LangChain, LlamaIndex, Haystack, Vercel AI SDK


Key Features

  • LLM tracing: Track every call, retrieval, embedding, and agent action
  • Prompt management: Version control, collaboration, server/client caching
  • Evaluations: LLM-as-judge, user feedback, manual labeling, custom pipelines
  • Datasets: Benchmark testing for continuous improvement
  • Playground: Interactive prompt testing and iteration
  • Self-hosted: Docker, Kubernetes, Terraform (AWS/Azure/GCP)
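The prompt-management feature stores versioned templates with {{variable}} placeholders that clients fetch and fill at runtime (in the Python SDK, roughly `langfuse.get_prompt(name)` followed by `prompt.compile(...)`). As a local illustration of the compile step only — this sketch does not use the SDK, and the template and variable names are invented for the example:

```python
# Local stand-in for the fill step of Langfuse prompt compilation:
# replace {{name}} placeholders with supplied variables, leaving
# any placeholder without a matching variable untouched.
import re

def compile_prompt(template: str, **variables: str) -> str:
    """Fill {{name}} placeholders in a prompt template."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: variables.get(m.group(1), m.group(0)),
        template,
    )

template = "You are a {{role}}. Answer the question: {{question}}"
print(compile_prompt(template, role="movie critic", question="Is Dune good?"))
# → You are a movie critic. Answer the question: Is Dune good?
```

In the real SDK the template itself lives server-side under version control, so prompt text can change without redeploying the application.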

FAQ

Q: What is Langfuse?
A: Langfuse is an open-source LLM observability platform with 24.1K+ stars for tracing, prompt management, and evaluation. Self-host with Docker or use the cloud version. MIT licensed.

Q: How do I install Langfuse?
A: Run pip install langfuse openai, then swap from openai import OpenAI for from langfuse.openai import openai to get automatic tracing.



Sources & Acknowledgements

Created by Langfuse. Licensed under MIT. langfuse/langfuse — 24,100+ GitHub stars
