How-to · Beginner · 3 min read

Langfuse dashboard explained

Quick answer
The Langfuse dashboard is the web interface of Langfuse, an observability platform that tracks and visualizes your AI model interactions, including prompts, responses, and metadata. Use the langfuse Python SDK to instrument your code, enabling detailed tracing and analytics of your LLM calls and workflows.

PREREQUISITES

  • Python 3.8+
  • Langfuse API keys (public and secret)
  • pip install langfuse

Setup

Install the langfuse Python package and set your API keys as environment variables for secure authentication.

  • Install Langfuse SDK: pip install langfuse
  • Set environment variables LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY
python
import os

# Use keys already present in the environment, or fall back to placeholders.
# Replace the placeholders with the keys from your Langfuse project settings.
os.environ["LANGFUSE_PUBLIC_KEY"] = os.environ.get("LANGFUSE_PUBLIC_KEY", "your_public_key_here")
os.environ["LANGFUSE_SECRET_KEY"] = os.environ.get("LANGFUSE_SECRET_KEY", "your_secret_key_here")
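Equivalently, you can export the keys in your shell before running the script, so no secrets live in the code (placeholders shown; substitute your own keys):

```shell
# Set Langfuse credentials for the current shell session
export LANGFUSE_PUBLIC_KEY="your_public_key_here"
export LANGFUSE_SECRET_KEY="your_secret_key_here"
# Optional: point at a self-hosted instance instead of Langfuse Cloud
export LANGFUSE_HOST="https://cloud.langfuse.com"
```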

Step by step

Initialize the Langfuse client and use the @observe() decorator to automatically trace your AI function calls. This example shows a simple LLM call wrapped for observability.

python
import os

from langfuse import Langfuse
from langfuse.decorators import observe

# Explicit client for manual tracing and flushing. Note that the @observe()
# decorator itself reads the LANGFUSE_* environment variables, so this
# instance is optional for decorator-only usage.
langfuse = Langfuse(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host="https://cloud.langfuse.com",
)

@observe()
def generate_response(prompt: str) -> str:
    # Simulate an LLM call; replace with a real model call in practice
    response = f"Response to: {prompt}"
    return response

if __name__ == "__main__":
    answer = generate_response("What is Langfuse?")
    print(answer)
    langfuse.flush()  # send buffered trace data before the script exits
output
Response to: What is Langfuse?

Common variations

You can integrate Langfuse with OpenAI or other LLM clients by using the Langfuse OpenAI wrapper for automatic tracing. Async functions and manual context management are also supported.

python
import os

# Drop-in replacement for the OpenAI SDK: all calls are traced automatically
from langfuse.openai import openai

# The wrapper reads LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY from the
# environment; only the OpenAI API key is passed to the client itself.
client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Explain Langfuse dashboard."}]
)
print(response.choices[0].message.content)
output (model responses vary)
The Langfuse dashboard provides detailed tracing and analytics for your AI model calls, helping you monitor performance and usage.

Troubleshooting

  • If you see no data in the dashboard, verify that LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY are set correctly.
  • Ensure network connectivity to https://cloud.langfuse.com (or your self-hosted instance).
  • In short-lived scripts, call flush() on the Langfuse client before exiting; events are sent asynchronously in the background and can be lost otherwise.
  • Check that your functions are decorated with @observe() or that you use the Langfuse client wrappers properly.

Key Takeaways

  • Use the langfuse Python SDK to instrument AI calls for detailed observability.
  • Set LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY environment variables for authentication.
  • Decorate your AI functions with @observe() to automatically trace and send data to the Langfuse dashboard.
  • Langfuse integrates seamlessly with OpenAI SDK via a drop-in wrapper for automatic tracing.
  • Verify keys and network access if dashboard data does not appear.
Verified 2026-04 · gpt-4o