# LangFuse — Open Source LLM Observability & Tracing

> Trace, evaluate, and monitor LLM applications in production. Open-source alternative to LangSmith with prompt management, cost tracking, and evaluation pipelines.

## Install

```bash
pip install langfuse
```

## Quick Use

A few lines of code are enough to start tracing LLM calls and see latency, cost, and output quality:

```python
from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-...",
    secret_key="sk-...",
    host="https://cloud.langfuse.com",
)

# Trace a generation
trace = langfuse.trace(name="chat")
generation = trace.generation(
    name="llm-call",
    model="gpt-4o",
    input=[{"role": "user", "content": "Hello"}],
    output="Hi there!",
    usage={"input": 10, "output": 5},
)

# Events are sent asynchronously; flush before the process exits
langfuse.flush()
```

## What is LangFuse?

LangFuse is an open-source observability platform for LLM applications. It provides tracing, prompt management, evaluation, and cost analytics, helping teams debug, improve, and monitor their AI features in production.

**Answer-Ready**: LangFuse is an open-source LLM observability platform providing tracing, prompt management, evaluation pipelines, and cost analytics for production AI applications.

## Core Features

### 1. Distributed Tracing

Trace complex chains and agent workflows with the `@observe()` decorator; nested calls are recorded automatically as child spans:

```python
from langfuse.decorators import observe

@observe()
def my_agent(query: str):
    context = retrieve_docs(query)
    return generate_response(query, context)

@observe()
def retrieve_docs(query: str):
    # Automatically nested as a child span of my_agent
    return vector_db.search(query)

@observe()
def generate_response(query: str, context: str):
    return openai.chat.completions.create(...)
```

### 2. Framework Integrations

```python
# OpenAI SDK (drop-in replacement; all calls automatically traced)
from langfuse.openai import openai

# LangChain
from langfuse.callback import CallbackHandler
handler = CallbackHandler()
chain.invoke({"input": "..."}, config={"callbacks": [handler]})

# LlamaIndex
from llama_index.core import Settings
Settings.callback_manager.add_handler(langfuse_handler)
```
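The nesting behavior that `@observe()` provides can be illustrated without the SDK. Below is a toy re-implementation showing how a decorator can build a span tree from ordinary call structure: the `observe`, `_stack`, and `roots` names here are made up for illustration and are not Langfuse's actual internals.

```python
import functools

_stack = []   # currently open spans
roots = []    # finished top-level spans ("traces")

def observe(fn):
    """Toy decorator: records each call as a span, nested under its caller."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        span = {"name": fn.__name__, "children": []}
        if _stack:
            _stack[-1]["children"].append(span)  # nest under the calling span
        else:
            roots.append(span)                   # no caller: new top-level trace
        _stack.append(span)
        try:
            return fn(*args, **kwargs)
        finally:
            _stack.pop()
    return wrapper

@observe
def retrieve_docs(query):
    return f"docs for {query}"

@observe
def my_agent(query):
    return retrieve_docs(query)

my_agent("hello")
# roots[0] is {"name": "my_agent", "children": [{"name": "retrieve_docs", ...}]}
```

Because nesting is driven by the call stack, no manual parent/child wiring is needed; the real SDK follows the same idea while also capturing timing, inputs, and outputs.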
### 3. Prompt Management

Version and deploy prompts from the LangFuse UI:

```python
prompt = langfuse.get_prompt("customer-support-v2")
compiled = prompt.compile(customer_name="Alice")
```

### 4. Evaluation Pipelines

Score traces manually or with LLM-as-judge:

```python
langfuse.score(
    trace_id="trace-123",
    name="helpfulness",
    value=0.9,
    comment="Accurate and complete",
)
```

### 5. Cost Dashboard

Automatic cost calculation per model, per user, and per feature.

## Self-Hosting

```bash
docker compose up -d  # PostgreSQL + LangFuse server
```

Or use the managed cloud at cloud.langfuse.com.

## FAQ

**Q: How does it compare to LangSmith?**
A: LangFuse is open-source and self-hostable. LangSmith is proprietary and built around the LangChain ecosystem.

**Q: Does it work without LangChain?**
A: Yes. LangFuse is framework-agnostic and works with any Python or JS/TS app.

**Q: What is the production overhead?**
A: Tracing is async by default: events are batched and sent in the background, adding under 1 ms of overhead per call.

## Source & Thanks

- GitHub: [langfuse/langfuse](https://github.com/langfuse/langfuse) (10k+ stars)
- Docs: [langfuse.com/docs](https://langfuse.com/docs)

---

Source: https://tokrepo.com/en/workflows/763445c0-8c1e-41de-97c8-9685e59521a7
Author: AI Open Source
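The sub-millisecond overhead mentioned in the FAQ comes from a standard pattern: the instrumented call only enqueues an event, while a background thread batches and ships them. A minimal sketch of that pattern follows; it is illustrative only, not Langfuse's implementation, and `send_batch` is a stand-in for the network call.

```python
import queue
import threading

events = queue.Queue()
sent_batches = []

def send_batch(batch):
    # Stand-in for the HTTP request that would ship events to the server.
    sent_batches.append(batch)

def worker(batch_size=10, timeout=0.05):
    """Drain the queue, sending full batches or flushing on idle/shutdown."""
    batch = []
    while True:
        try:
            item = events.get(timeout=timeout)
        except queue.Empty:
            item = None  # idle: flush whatever has accumulated
        if item is not None and item != "STOP":
            batch.append(item)
        if batch and (len(batch) >= batch_size or item in (None, "STOP")):
            send_batch(batch)
            batch = []
        if item == "STOP":
            return

t = threading.Thread(target=worker, daemon=True)
t.start()

# The hot path: enqueueing is all the instrumented call pays for.
for i in range(25):
    events.put({"event": i})

events.put("STOP")   # plays the role of flush()/shutdown
t.join()
```

Batching like this keeps network latency off the request path, which is why an explicit `flush()` matters in short-lived scripts: without it, the process can exit before the background worker drains the queue.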