How-to · Beginner · 3 min read

How to version prompts in Langfuse

Quick answer
Version prompts in Langfuse by attaching a version identifier as a trace tag or as custom metadata on each LLM call. With the decorator-based Python SDK, wrap the call in @observe and set the tag via langfuse_context.update_current_trace; you can then filter traces by version in the dashboard and compare model outputs across prompt versions.

Prerequisites

  • Python 3.8+
  • pip install langfuse
  • Langfuse API keys (public and secret)
  • Basic familiarity with Python decorators

Setup

Install the langfuse Python package. The decorator-based SDK reads its credentials from the LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST environment variables, so export them before running your script.

```bash
pip install langfuse
```
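The key values below are placeholders; substitute the project keys from your own Langfuse settings page:

```bash
# Placeholder values -- use the keys from your own Langfuse project
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_HOST="https://cloud.langfuse.com"  # or your self-hosted URL
```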

Step by step

Wrap your prompt function with the @observe decorator, then call langfuse_context.update_current_trace inside it to attach a version tag to the current trace. The decorator SDK authenticates using the API keys from the environment variables set above.

```python
from langfuse.decorators import observe, langfuse_context

@observe()
def generate_response(prompt: str) -> str:
    # Tag the current trace with the prompt version
    langfuse_context.update_current_trace(tags=["prompt_version:v1.0"])
    # Simulate LLM call
    response = f"Response to: {prompt}"
    return response

if __name__ == "__main__":
    output = generate_response("What is Langfuse?")
    print(output)
```
Output:

```
Response to: What is Langfuse?
```

Common variations

You can version prompts by changing the tag value (e.g., prompt_version:v1.1) or by attaching structured metadata to the trace via langfuse_context.update_current_trace for more detailed tracking. Async functions are supported in the same way.

```python
from langfuse.decorators import observe, langfuse_context

@observe()
def generate_response_v2(prompt: str) -> str:
    # Record the prompt version as structured metadata on the trace
    langfuse_context.update_current_trace(metadata={"prompt_version": "v2.0"})
    return f"[v2] Response to: {prompt}"

if __name__ == "__main__":
    output = generate_response_v2("How to version prompts?")
    print(output)
```
Output:

```
[v2] Response to: How to version prompts?
```

Troubleshooting

  • If you don’t see prompt versions in the Langfuse dashboard, verify that the LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY environment variables are set correctly.
  • Ensure the @observe decorator is applied to the function making the prompt call.
  • For missing tags or metadata, confirm that langfuse_context.update_current_trace is called inside a function decorated with @observe; outside an active trace it has no effect.

Key Takeaways

  • Tag traces with a version string (e.g., prompt_version:v1.0) via langfuse_context.update_current_trace inside an @observe-decorated function.
  • Add richer version details as trace metadata for structured, queryable tracking.
  • Set your Langfuse API keys in the environment before running your script.
Verified 2026-04