# Langfuse — Open Source LLM Observability

> Langfuse is an open-source LLM engineering platform for tracing, prompt management, evaluation, and debugging AI apps. 24.1K+ GitHub stars. Self-hosted or cloud. MIT licensed.

## Install

Save in your project root:

## Quick Use

```bash
# Install
pip install langfuse openai

# Add tracing to your LLM calls
python -c "
from langfuse.openai import openai  # Automatic tracing — just swap the import

client = openai.OpenAI()
response = client.chat.completions.create(
    model='gpt-4o-mini',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)
print(response.choices[0].message.content)
# Traces visible at http://localhost:3000
"

# Self-host with Docker
docker compose up -d  # from the langfuse repo
```

---

## Intro

Langfuse is an open-source LLM engineering platform that helps teams collaboratively develop, monitor, evaluate, and debug AI applications. With 24,100+ GitHub stars and an MIT license, Langfuse provides LLM observability with detailed tracing of LLM calls, retrievals, embeddings, and agent actions. It includes centralized prompt management with version control, evaluation pipelines (LLM-as-judge, user feedback, manual labeling), dataset testing, an interactive playground, and SDKs for Python and JavaScript. Self-host in minutes with Docker or Kubernetes, or deploy on AWS/Azure/GCP.
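The drop-in import shown in Quick Use reads its credentials from environment variables. A minimal setup sketch, assuming a self-hosted instance at `http://localhost:3000`; the key values below are hypothetical placeholders, not real keys:

```shell
# Hypothetical placeholder keys: copy real ones from your
# Langfuse project's API Keys settings after the server is up.
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_HOST="http://localhost:3000"  # self-hosted; omit to use Langfuse Cloud
export OPENAI_API_KEY="sk-..."                # still needed for the OpenAI call itself
```

With these set in the same shell, the `python -c` snippet above should send traces to the dashboard at `LANGFUSE_HOST` without further configuration.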
**Best for**: Teams building production AI apps who need observability, prompt management, and evaluation

**Works with**: Claude Code, OpenAI Codex, Cursor, Gemini CLI, Windsurf

**Integrations**: OpenAI, LangChain, LlamaIndex, Haystack, Vercel AI SDK

---

## Key Features

- **LLM tracing**: Track every call, retrieval, embedding, and agent action
- **Prompt management**: Version control, collaboration, server/client caching
- **Evaluations**: LLM-as-judge, user feedback, manual labeling, custom pipelines
- **Datasets**: Benchmark testing for continuous improvement
- **Playground**: Interactive prompt testing and iteration
- **Self-hosted**: Docker, Kubernetes, Terraform (AWS/Azure/GCP)

---

## FAQ

**Q: What is Langfuse?**
A: Langfuse is an LLM observability platform with 24.1K+ stars for tracing, prompt management, and evaluation. Self-host with Docker or use the cloud. MIT licensed.

**Q: How do I install Langfuse?**
A: `pip install langfuse openai`. Swap `from openai import OpenAI` to `from langfuse.openai import openai` for automatic tracing.

---

## Source & Thanks

> Created by [Langfuse](https://github.com/langfuse). Licensed under MIT.
> [langfuse/langfuse](https://github.com/langfuse/langfuse) — 24,100+ GitHub stars

---

Source: https://tokrepo.com/en/workflows/49a8eb0b-b44b-46c2-b3c8-b54e55fb224f
Author: AI Open Source