How to get started with Langfuse
Quick answer
Use the langfuse Python SDK to integrate AI observability by initializing Langfuse with your API keys from environment variables. Decorate your AI call functions with @observe() to automatically track and log model interactions.
Prerequisites
- Python 3.8+
- Langfuse API keys (public and secret)
- pip install langfuse
Setup
Install the langfuse Python package and set your API keys as environment variables for secure authentication.
pip install langfuse
Step by step
Initialize the Langfuse client with your public_key and secret_key from environment variables. Use the @observe() decorator on your AI function to automatically trace calls and responses.
import os
from langfuse import Langfuse
from langfuse.decorators import observe
langfuse = Langfuse(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host="https://cloud.langfuse.com"
)

@observe()
def call_ai_model(prompt: str) -> str:
    # Simulate an AI call (replace with a real AI client call)
    response = f"Response to: {prompt}"
    return response

if __name__ == "__main__":
    result = call_ai_model("What is Langfuse?")
    print(result)

Output
Response to: What is Langfuse?
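Conceptually, @observe() behaves like a wrapping decorator that records a function's inputs, output, and latency for each call. The stand-in below is a simplified illustration of that pattern only; trace_calls and the traces list are made up for this sketch and are not Langfuse internals:

```python
import functools
import time

traces = []  # stand-in for the spans Langfuse would send to its backend

def trace_calls(func):
    """Toy decorator: record name, inputs, output, and duration of each call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        traces.append({
            "name": func.__name__,
            "input": {"args": args, "kwargs": kwargs},
            "output": result,
            "duration_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@trace_calls
def call_ai_model(prompt: str) -> str:
    return f"Response to: {prompt}"

call_ai_model("What is Langfuse?")
```

Because the wrapper returns the original result unchanged, instrumenting a function this way does not alter its behavior for callers.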
Common variations
You can use Langfuse with different AI models by wrapping their call functions with @observe(). For asynchronous calls, use @observe() on async functions. You can also configure the host parameter if using a self-hosted Langfuse instance.
import asyncio
from langfuse.decorators import observe

@observe()
async def async_ai_call(prompt: str) -> str:
    # Simulate an async AI call
    await asyncio.sleep(0.1)
    return f"Async response to: {prompt}"

async def main():
    response = await async_ai_call("Async Langfuse example")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())

Output
Async response to: Async Langfuse example
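The async case differs only in that the wrapper itself must be a coroutine and await the wrapped function. A toy stand-in (illustrative only, not Langfuse's implementation; trace_async_calls and async_traces are made up for this sketch):

```python
import asyncio
import functools

async_traces = []  # stand-in for recorded spans

def trace_async_calls(func):
    """Toy async decorator: await the coroutine and record its result."""
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        result = await func(*args, **kwargs)
        async_traces.append({"name": func.__name__, "output": result})
        return result
    return wrapper

@trace_async_calls
async def async_ai_call(prompt: str) -> str:
    await asyncio.sleep(0.01)
    return f"Async response to: {prompt}"

print(asyncio.run(async_ai_call("Async Langfuse example")))
```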
Troubleshooting
- If you see authentication errors, verify that your LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY environment variables are set correctly.
- If no data appears in the Langfuse dashboard, ensure your decorated functions are actually called and that the Langfuse client is properly initialized.
- For network issues, check your internet connection and firewall settings to allow access to https://cloud.langfuse.com.
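To rule out the first issue quickly, a few lines of standard-library Python can report which variables are missing before the client is ever constructed (missing_langfuse_env is a hypothetical helper for this sketch, not part of the SDK):

```python
import os

REQUIRED_VARS = ["LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY"]

def missing_langfuse_env(env=os.environ):
    """Return the names of required Langfuse variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

missing = missing_langfuse_env()
if missing:
    print(f"Missing environment variables: {', '.join(missing)}")
```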
Key takeaways
- Install the langfuse package and set API keys via environment variables for secure usage.
- Use the @observe() decorator to automatically trace AI model calls and responses.
- Langfuse supports both synchronous and asynchronous function tracing in Python.
- Verify environment variables and network access if observability data does not appear.
- Customize the host parameter for self-hosted Langfuse deployments.