Langfuse trace structure explained
Quick answer
The Langfuse trace structure organizes AI interactions into hierarchical spans representing individual LLM calls, tool invocations, and user interactions. Each trace includes metadata such as timestamps, inputs, outputs, and custom tags, enabling detailed observability and debugging of AI workflows.

Prerequisites
- Python 3.8+
- Langfuse account and API keys
- `pip install langfuse`
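Conceptually, the hierarchy described above can be pictured as a plain nested data structure. This is an illustrative model only, not Langfuse's actual schema; the span names here are made up for the example:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Span:
    """One unit of work in a trace: an LLM call, tool invocation, etc."""
    name: str
    input: str = ""
    output: str = ""
    start_time: float = field(default_factory=time.time)
    tags: list = field(default_factory=list)
    children: list = field(default_factory=list)  # nested child spans

# A trace is effectively a root span with nested children
trace = Span(name="handle-user-request", input="What is Langfuse?")
llm_call = Span(name="llm-call", input="prompt...", output="completion...")
tool_call = Span(name="tool-call", input="search query", output="results")
llm_call.children.append(tool_call)   # tool call nested under the LLM call
trace.children.append(llm_call)       # LLM call nested under the root trace

print(trace.name, "->", trace.children[0].name,
      "->", trace.children[0].children[0].name)
# → handle-user-request -> llm-call -> tool-call
```

Each span carries its own inputs, outputs, timestamps, and tags, which is what makes per-step debugging possible in the dashboard.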
Setup
Install the langfuse Python package and configure your API keys as environment variables. Initialize the Langfuse client with your public_key and secret_key to start tracing AI calls.
```python
import os
from langfuse import Langfuse

# Initialize the client with credentials from environment variables
langfuse = Langfuse(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host="https://cloud.langfuse.com",
)
```

Step by step
Use the `@observe()` decorator to automatically trace function calls that interact with AI models. Each trace captures inputs, outputs, timestamps, and metadata in a structured format. For finer control, the `langfuse_context` object from `langfuse.decorators` lets you update the current trace or observation (for example, to add tags or metadata) from inside a decorated function.
```python
from langfuse.decorators import observe

@observe()
def generate_response(prompt: str) -> str:
    # Simulate an AI call
    response = f"Response to: {prompt}"
    return response

if __name__ == "__main__":
    result = generate_response("Hello Langfuse")
    print(result)
```

Output
Response to: Hello Langfuse
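To make the capture step concrete, here is a toy stand-in for what a decorator like `@observe()` records per call: inputs, outputs, and timing. This is a simplified sketch, not the SDK's implementation; `observe_sketch` and its in-memory `records` list are invented for illustration (the real SDK sends this data to the Langfuse backend instead):

```python
import functools
import time

def observe_sketch(fn):
    """Toy stand-in for an observe-style decorator: records input,
    output, and duration for each call of the wrapped function."""
    records = []

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = fn(*args, **kwargs)
        records.append({
            "name": fn.__name__,
            "input": {"args": args, "kwargs": kwargs},
            "output": result,
            "duration_s": time.time() - start,
        })
        return result

    wrapper.records = records  # exposed here for inspection
    return wrapper

@observe_sketch
def generate_response(prompt: str) -> str:
    return f"Response to: {prompt}"

generate_response("Hello Langfuse")
print(generate_response.records[0]["output"])
# → Response to: Hello Langfuse
```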
Common variations
You can trace asynchronous functions by applying `@observe()` to `async def` functions. Langfuse also supports tracing OpenAI and other AI SDK calls automatically by wrapping their clients. Customize trace metadata with tags, or add nested spans for tool calls.
```python
import asyncio
from langfuse.decorators import observe

@observe()
async def async_generate(prompt: str) -> str:
    await asyncio.sleep(0.1)  # Simulate an async AI call
    return f"Async response to: {prompt}"

async def main():
    result = await async_generate("Async Langfuse")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
```

Output
Async response to: Async Langfuse
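The nested-span behavior mentioned above (tool calls appearing as children of the calling function's span) can be sketched with a context variable that tracks the currently active parent. Again, this is a simplified illustration of the mechanism, not the Langfuse API; `traced` and `roots` are made-up names:

```python
import contextvars
import functools

_current_span = contextvars.ContextVar("current_span", default=None)
roots = []  # completed top-level spans (one per trace)

def traced(fn):
    """Toy sketch of nested-span tracking: each call attaches itself as
    a child of whichever span is active, mirroring how nested decorated
    functions produce hierarchical spans."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        span = {"name": fn.__name__, "children": []}
        parent = _current_span.get()
        if parent is None:
            roots.append(span)            # no active span: this is a root
        else:
            parent["children"].append(span)  # nest under the caller's span
        token = _current_span.set(span)
        try:
            span["output"] = fn(*args, **kwargs)
            return span["output"]
        finally:
            _current_span.reset(token)   # restore the previous parent
    return wrapper

@traced
def search_tool(query: str) -> str:
    return f"results for {query}"

@traced
def agent(prompt: str) -> str:
    docs = search_tool(prompt)  # nested call becomes a child span
    return f"answer using {docs}"

agent("langfuse")
print(roots[0]["name"], "->", roots[0]["children"][0]["name"])
# → agent -> search_tool
```

The same parent/child bookkeeping is what lets the dashboard render a trace as a tree of spans rather than a flat list of events.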
Troubleshooting
- If traces do not appear in the Langfuse dashboard, verify that your `LANGFUSE_PUBLIC_KEY` and `LANGFUSE_SECRET_KEY` environment variables are set correctly.
- Ensure network connectivity to https://cloud.langfuse.com.
- Check for exceptions in your traced functions; unhandled errors may prevent trace completion.
- Langfuse sends events asynchronously in batches; in short-lived scripts, call `langfuse.flush()` before exit so buffered events are delivered.
Key takeaways
- Use the `Langfuse` client with your API keys to enable tracing.
- Decorate AI interaction functions with `@observe()` to capture structured traces automatically.
- Traces include hierarchical spans with inputs, outputs, timestamps, and metadata for detailed observability.
- Langfuse supports both synchronous and asynchronous tracing with customizable context.
- Verify environment variables and network access if traces do not appear.