How-to · Beginner · 3 min read

How to use RunnableWithMessageHistory in LangChain

Quick answer
Use RunnableWithMessageHistory in LangChain to create a runnable chain that automatically manages and persists chat message history. It wraps a runnable (like an LLM chain) and stores messages, enabling stateful chat interactions with minimal code.

Prerequisites

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install langchain_openai langchain_community

Setup

Install the necessary LangChain packages and set your OpenAI API key in the environment variables.

bash
pip install langchain_openai langchain_community

Step by step

This example wraps an OpenAI chat model in RunnableWithMessageHistory so that prior messages are loaded before each call and the new turn is saved afterwards, giving you a stateful conversation with no manual bookkeeping.

python
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

# Initialize the chat model (reads OPENAI_API_KEY from the environment)
chat = ChatOpenAI(model="gpt-4o", temperature=0)

# In-memory store mapping session IDs to their chat histories
store = {}

def get_session_history(session_id: str) -> InMemoryChatMessageHistory:
    if session_id not in store:
        store[session_id] = InMemoryChatMessageHistory()
    return store[session_id]

# Wrap the chat model so history is loaded and saved automatically
runnable_with_history = RunnableWithMessageHistory(chat, get_session_history)

# Each conversation is identified by a session_id in the config
config = {"configurable": {"session_id": "demo"}}

# Run the chain with an initial user message
response1 = runnable_with_history.invoke("Hello, who won the World Series in 2020?", config=config)
print("Response 1:", response1.content)

# Ask a follow-up; the history for session "demo" is included automatically
response2 = runnable_with_history.invoke("Where was it played?", config=config)
print("Response 2:", response2.content)
output
Response 1: The Los Angeles Dodgers won the World Series in 2020.
Response 2: The 2020 World Series was played at Globe Life Field in Arlington, Texas.

Common variations

  • Use a different chat model by changing the model parameter in ChatOpenAI (e.g., "gpt-4o-mini").
  • Use async invocation with await runnable_with_history.ainvoke(...) in async functions.
  • Customize where history is stored by having your get_session_history callable return any BaseChatMessageHistory implementation (e.g., a Redis- or file-backed store from langchain_community).

Troubleshooting

  • If you get authentication errors, verify your OPENAI_API_KEY environment variable is set correctly.
  • If message history is not preserved, check that you pass the same session_id in the config on every call and that the underlying history store is not recreated between calls.
  • For unexpected model errors, check the model name and API usage limits.

Key Takeaways

  • Use RunnableWithMessageHistory to automatically manage chat message history in LangChain runnables.
  • Wrap any runnable chat model to enable stateful conversations with minimal code changes.
  • Pass the same session_id (and keep the backing history store alive) to preserve conversation context across calls.
Verified 2026-04 · gpt-4o