How-to · Beginner · 3 min read

LangSmith trace metadata explained

Quick answer
In LangSmith, trace metadata captures detailed context about AI calls, including prompt inputs, model parameters, and execution environment. This metadata enables observability and debugging by providing structured information alongside each LLM invocation.

PREREQUISITES

  • Python 3.8+
  • LangSmith API key
  • pip install langsmith

Setup

Install the langsmith Python package, set your API key as an environment variable, and enable tracing so that decorated calls are exported to LangSmith.

bash
pip install langsmith
export LANGSMITH_API_KEY="<your-api-key>"
export LANGSMITH_TRACING=true

Step by step

Use the @traceable decorator from the LangSmith SDK to trace AI calls. Captured metadata fields include prompt, model, temperature, tokens, and duration_ms, giving insight into the input, model configuration, token usage, and latency of each run.

python
import os
from langsmith import Client, traceable

# Initialize a LangSmith client (useful for querying runs later);
# the @traceable decorator itself reads LANGSMITH_API_KEY from the environment
client = Client(api_key=os.environ["LANGSMITH_API_KEY"])

# Trace the function: inputs, outputs, and timing are recorded with the run
@traceable
def generate_text(prompt: str) -> str:
    # Simulate an AI call (replace with an actual LLM call)
    response = f"Response to: {prompt}"
    return response

# Run the traced function
result = generate_text("Explain LangSmith trace metadata")
print("Generated text:", result)
output
Generated text: Response to: Explain LangSmith trace metadata
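The fields named above can be pictured as a plain dictionary stored alongside each run. The exact schema is defined by LangSmith; the record below is an illustrative sketch with made-up values, not the SDK's internal format.

```python
# Illustrative sketch of the metadata a single traced run might carry.
# Field names follow the prose above; values are hypothetical.
trace_metadata = {
    "prompt": "Explain LangSmith trace metadata",  # input to the call
    "model": "gpt-4o-mini",                        # hypothetical model name
    "temperature": 0.7,                            # sampling parameter
    "tokens": {"prompt": 12, "completion": 48},    # token usage
    "duration_ms": 235,                            # wall-clock latency
}

# Total tokens consumed by the call
total_tokens = sum(trace_metadata["tokens"].values())
print(total_tokens)  # → 60
```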

Common variations

You can add custom metadata fields such as user_id or session_id to enrich traces. Async functions are traced the same way, and integrations with LangChain or the OpenAI SDK work by decorating or wrapping calls similarly.

python
import os
import asyncio
from langsmith import Client, traceable

client = Client(api_key=os.environ["LANGSMITH_API_KEY"])

# Attach custom metadata fields to every run of this function
@traceable(metadata={"user_id": "user-123", "session_id": "sess-456"})
async def async_generate(prompt: str) -> str:
    # Simulate async AI call
    await asyncio.sleep(0.1)
    return f"Async response to: {prompt}"

async def main():
    result = await async_generate("Async LangSmith trace example")
    print("Async generated text:", result)

asyncio.run(main())
output
Async generated text: Async response to: Async LangSmith trace example
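Decorator-level metadata applies to every call of the function, while per-call fields such as user_id or session_id extend or override it. The dict merge below is a plain-Python sketch of that layering, not the SDK's actual merge logic.

```python
# Sketch of layering decorator-level metadata with per-call fields.
# Illustrative only; the langsmith SDK performs its own merge internally.
decorator_metadata = {"app": "docs-demo", "model": "gpt-4o-mini"}

def merged_metadata(call_fields: dict) -> dict:
    # Per-call fields (e.g. user_id, session_id) extend the defaults
    return {**decorator_metadata, **call_fields}

merged = merged_metadata({"user_id": "user-123", "session_id": "sess-456"})
print(merged)
```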

Troubleshooting

  • If traces do not appear in the LangSmith dashboard, verify that LANGSMITH_API_KEY is set correctly and that tracing is enabled (LANGSMITH_TRACING=true).
  • Ensure your traced functions are decorated with @traceable so metadata is captured.
  • Check network connectivity if tracing to a remote LangSmith instance.
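A quick sanity check for the first two bullets: confirm the environment is configured before digging further. The variable names are the ones LangSmith reads (LANGCHAIN_TRACING_V2 is the legacy flag); the helper function itself is just a local convenience, not part of the SDK.

```python
import os

def tracing_env_ok() -> bool:
    """Return True if the env vars LangSmith tracing relies on are set."""
    has_key = bool(os.environ.get("LANGSMITH_API_KEY"))
    # Either the current or the legacy tracing flag enables trace export
    tracing_on = (os.environ.get("LANGSMITH_TRACING") == "true"
                  or os.environ.get("LANGCHAIN_TRACING_V2") == "true")
    return has_key and tracing_on

print("env configured:", tracing_env_ok())
```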

Key Takeaways

  • Use @traceable decorator to automatically capture trace metadata in LangSmith.
  • Trace metadata includes prompt, model parameters, token usage, and timing for observability.
  • Add custom metadata fields to enrich trace context for debugging and analysis.
Verified 2026-04