# Langfuse Prompt Management — Versioned Prompts + A/B Tests

> Langfuse Prompt Management versions, labels, and A/B tests prompts. Edit in the UI, fetch via SDK, swap models without code deploys.

## Install

Copy the content below into your project:

## Quick Use

1. `pip install langfuse`
2. Set `LANGFUSE_PUBLIC_KEY` / `LANGFUSE_SECRET_KEY` / `LANGFUSE_HOST` in your environment
3. Author the prompt in the UI, label it `production`, then fetch it with `lf.get_prompt(name, label='production')`

---

## Intro

Langfuse Prompt Management stores prompts as versioned, labeled artifacts in Langfuse — edit in the UI, fetch from the SDK at runtime, and swap models or templates without code deploys. Each prompt version carries labels (`production`, `staging`, `experiment-v3`) that your code points at.

Best for: teams iterating on prompts faster than they can ship code, A/B testing prompts, and granting non-engineers safe edit access.

Works with: Python and JS SDKs, LangChain Hub-compatible, OpenAI and Anthropic message formats. Setup time: 5 minutes.

---

### Push a prompt programmatically

```python
from langfuse import Langfuse

lf = Langfuse()  # picks up LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST

lf.create_prompt(
    name="support-triage",
    type="chat",  # message-list prompts need type="chat"; the default is "text"
    prompt=[
        {"role": "system", "content": "You triage support tickets into urgent / billing / general."},
        {"role": "user", "content": "Ticket: {{ticket_text}}"},
    ],
    config={"model": "claude-3-5-sonnet-20241022", "temperature": 0.2},
    labels=["production"],
)
```

### Fetch + execute (with caching)

```python
from langfuse import Langfuse
from langfuse.openai import openai  # drop-in wrapper that auto-traces OpenAI calls

lf = Langfuse()

prompt = lf.get_prompt("support-triage", label="production", cache_ttl_seconds=60)
messages = prompt.compile(ticket_text="My card was charged twice for order #4521")

resp = openai.chat.completions.create(
    model=prompt.config["model"],
    messages=messages,
    temperature=prompt.config["temperature"],
)
```

### A/B testing two prompt versions

Tag two versions with separate labels (`production-a`, `production-b`), split traffic in code by user_id hash, then compare success metrics in the Langfuse Scores tab.

```python
import hashlib


def variant(user_id: str) -> str:
    """Deterministically assign a user to one of two prompt variants."""
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 2
    return "production-a" if bucket == 0 else "production-b"


prompt = lf.get_prompt("support-triage", label=variant(user_id))
```

### Why labels, not version pins

| Approach | Behavior |
|---|---|
| `label="production"` | Latest production-tagged version. Edits in the UI go live without a code deploy. |
| `version=7` | Hard pin. A code change is required to move forward. Use only for compliance freezes. |

### Self-hosted Langfuse

```bash
git clone https://github.com/langfuse/langfuse
cd langfuse
docker compose up -d
# Open http://localhost:3000, create an org + project, copy the keys into .env
```

---

### FAQ

**Q: How is this different from LangChain Hub?**
A: Langfuse Prompt Management ships with full traces — you see which prompt version produced which output, with cost and latency. LangChain Hub is registry-only with no observability. Langfuse also self-hosts; Hub is tied to LangSmith.

**Q: Can non-engineers edit prompts safely?**
A: Yes — give them Langfuse project access with the Editor role. They edit in the UI, and each save creates a new version. The `production` label only moves when an admin promotes it, so junior edits can't ship without review.

**Q: What's the `cache_ttl_seconds` default?**
A: 60 seconds. The SDK fetches and caches; subsequent calls within 60 s use the cache. Tune it lower for fast-iteration dev (1 s) or higher for stable prod (300 s+) to reduce control-plane load.

---

## Source & Thanks

> Built by [Langfuse](https://github.com/langfuse). Licensed under MIT.
>
> [langfuse/langfuse](https://github.com/langfuse/langfuse) — ⭐ 8,000+
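### Appendix: what `compile()` does

The `compile()` call in the fetch example fills `{{variable}}` placeholders in each message of the stored prompt. For intuition only, here is a minimal local sketch of that substitution — `compile_messages` is a hypothetical helper, not the SDK's actual implementation:

```python
import re


def compile_messages(messages, **variables):
    """Fill {{name}} placeholders in message contents, mimicking prompt.compile()."""
    def fill(text: str) -> str:
        # Unknown variables are left in place rather than raising.
        return re.sub(
            r"\{\{(\w+)\}\}",
            lambda m: str(variables.get(m.group(1), m.group(0))),
            text,
        )
    return [{**msg, "content": fill(msg["content"])} for msg in messages]


template = [
    {"role": "system", "content": "You triage support tickets into urgent / billing / general."},
    {"role": "user", "content": "Ticket: {{ticket_text}}"},
]
messages = compile_messages(template, ticket_text="My card was charged twice for order #4521")
# messages[1]["content"] → "Ticket: My card was charged twice for order #4521"
```

The system message has no placeholders, so it passes through unchanged; only `{{ticket_text}}` in the user message is replaced.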
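### Appendix: logging scores for the A/B comparison

The Scores-tab comparison above assumes your app attaches a success score to each trace. A minimal sketch of a score-logging helper, assuming the v2 Python SDK's `Langfuse.score()` method (newer SDK versions rename this call; `record_outcome` and the `ticket-resolved` score name are hypothetical):

```python
def record_outcome(lf, trace_id: str, variant_label: str, resolved: bool) -> None:
    """Attach a binary success score to a trace so prompt variants can be compared."""
    lf.score(
        trace_id=trace_id,
        name="ticket-resolved",
        value=1 if resolved else 0,
        comment=f"variant={variant_label}",  # lets you filter and group by variant in the UI
    )
```

Call it once the ticket outcome is known, passing the same `Langfuse` client and the trace id of the triage call; the Scores tab can then break `ticket-resolved` down by variant.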
---

Source: https://tokrepo.com/en/workflows/langfuse-prompt-management-versioned-prompts-a-b-tests
Author: Langfuse