How-to · Intermediate · 3 min read

LangSmith self-hosted deployment

Quick answer
To deploy LangSmith self-hosted, clone the LangSmith repository and run the backend and UI services on your own server. Configure environment variables for API keys and the database connection, then start the services with Docker Compose (or Python scripts) to get full AI traceability without cloud dependencies.

PREREQUISITES

  • Python 3.8+
  • Docker and Docker Compose installed
  • PostgreSQL database accessible
  • LangSmith API key (optional for cloud integration)
  • pip install langsmith (the Python SDK your applications use to send traces)
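Before starting the services, it can help to sanity-check that the PostgreSQL connection string you plan to put in DATABASE_URL is well formed. The helper below is a generic sketch (not part of LangSmith); the credentials and database name are placeholders:

```python
from urllib.parse import urlsplit

def check_database_url(url: str) -> dict:
    """Split a DATABASE_URL into its parts so missing pieces stand out."""
    parts = urlsplit(url)
    return {
        "scheme": parts.scheme,          # should be "postgresql"
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port or 5432,      # PostgreSQL default port
        "database": parts.path.lstrip("/"),
    }

info = check_database_url("postgresql://user:password@localhost:5432/langsmith")
print(info["database"])  # → langsmith
```

If any field comes back empty, fix the URL before starting the containers; a malformed DATABASE_URL is the most common cause of backend startup failures.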

Setup

Install Docker and Docker Compose to run LangSmith self-hosted services. Prepare a PostgreSQL database for LangSmith's backend storage. Clone the LangSmith GitHub repository to get the latest source code and configuration files.

bash
git clone https://github.com/langsmith/langsmith.git
cd langsmith
# Create .env file with database and API key settings
# Example .env content:
# DATABASE_URL=postgresql://user:password@localhost:5432/langsmith
# LANGSMITH_API_KEY=your_api_key

# Start services with Docker Compose
# (newer Docker installs use "docker compose up -d")
docker-compose up -d
output
Creating network "langsmith_default" with the default driver
Creating langsmith_backend_1 ... done
Creating langsmith_ui_1      ... done
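The containers can take a few seconds to become ready after `docker-compose up -d` returns. A small polling helper avoids connecting too early; this is a generic sketch, and the /health path and port 8000 are assumptions to adjust for your deployment:

```python
import time
import urllib.request
from typing import Callable

def wait_for(check: Callable[[], bool], timeout: float = 30.0,
             interval: float = 0.5) -> bool:
    """Poll `check` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval)
    return False

def backend_up(url: str = "http://localhost:8000/health") -> bool:
    """True if the backend answers with HTTP 200 (endpoint path is an assumption)."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False

# Usage: wait_for(backend_up) before pointing clients at the instance.
```

The same `wait_for` helper works for the database or the UI container by swapping in a different check function.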

Step by step

Run LangSmith backend and UI services locally or on your server. Configure environment variables for database connection and API keys. Use the LangSmith Python SDK to connect your AI applications to the self-hosted LangSmith instance for tracing and observability.

python
import os
from langsmith import Client, traceable

# Point the SDK at the self-hosted instance. Newer SDK releases read the
# LANGSMITH_-prefixed variables; older ones use LANGCHAIN_API_KEY,
# LANGCHAIN_ENDPOINT and LANGCHAIN_TRACING_V2 instead.
os.environ.setdefault("LANGSMITH_API_KEY", "your_api_key")
os.environ.setdefault("LANGSMITH_ENDPOINT", "http://localhost:8000/api")
os.environ.setdefault("LANGSMITH_TRACING", "true")

# The client picks up the endpoint and key from the environment; they can
# also be passed explicitly: Client(api_url=..., api_key=...)
client = Client()

# Example: trace a simple LLM call. Note that the decorator is the
# module-level langsmith.traceable, not a method on the client.
@traceable
def generate_text(prompt: str) -> str:
    # Simulate an LLM call
    return f"Response to: {prompt}"

result = generate_text("Hello LangSmith")
print("Traced output:", result)
output
Traced output: Response to: Hello LangSmith

Common variations

You can trace asynchronous code or integrate with LangChain by setting the LANGSMITH_ENDPOINT and LANGSMITH_API_KEY environment variables. Use Docker Compose for easy deployment, or run the backend and UI separately with Python for custom setups.

python
# Async example: langsmith.traceable also works on coroutines
import asyncio
import os
from langsmith import traceable

# Assumes LANGSMITH_API_KEY and LANGSMITH_ENDPOINT are already set,
# as in the synchronous example above.

@traceable
async def async_generate(prompt: str) -> str:
    return f"Async response to: {prompt}"

async def main():
    result = await async_generate("Async LangSmith")
    print("Async traced output:", result)

asyncio.run(main())
output
Async traced output: Async response to: Async LangSmith
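For the LangChain integration mentioned above, setting the tracing variables before your LangChain code runs is enough to route traces to the self-hosted instance. Newer SDKs read the LANGSMITH_-prefixed names shown here; older releases use LANGCHAIN_TRACING_V2, LANGCHAIN_ENDPOINT, and LANGCHAIN_API_KEY. The endpoint URL is a placeholder for your own deployment:

```python
import os

# Route LangChain/LangSmith traces to the self-hosted deployment.
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_ENDPOINT"] = "http://localhost:8000/api"
os.environ["LANGSMITH_API_KEY"] = "your_api_key"

# Any traceable-decorated function or LangChain runnable executed after
# this point reports its runs to the endpoint above.
print(os.environ["LANGSMITH_ENDPOINT"])
```

Set these in the process environment (or a .env file loaded at startup) rather than in code for production deployments, so the API key never lands in source control.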

Troubleshooting

  • If the UI does not load, verify Docker containers are running with docker ps and check logs with docker-compose logs.
  • Database connection errors usually mean incorrect DATABASE_URL in the .env file; ensure credentials and host are correct.
  • API key authentication failures require matching LANGSMITH_API_KEY in both client and server environment variables.
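A quick environment check can rule out the most common misconfiguration — an unset or empty variable. This is a hypothetical helper, not part of the SDK:

```python
import os

def missing_env(required=("LANGSMITH_API_KEY", "LANGSMITH_ENDPOINT",
                          "DATABASE_URL")) -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

problems = missing_env()
if problems:
    print("Missing:", ", ".join(problems))
else:
    print("All required variables are set.")
```

Run it in both the client environment and inside the backend container (docker-compose exec) to confirm the two sides agree.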

Key Takeaways

  • Use Docker Compose for the simplest LangSmith self-hosted deployment with PostgreSQL.
  • Configure LANGSMITH_ENDPOINT and LANGSMITH_API_KEY to connect your AI apps to the self-hosted instance.
  • LangSmith supports synchronous and asynchronous tracing with the same client API.
  • Check container logs and environment variables to resolve common deployment issues.
Verified 2026-04