How-to · Beginner · 3 min read

How to add memory to LangChain chatbot

Quick answer
Add memory to a LangChain chatbot with memory classes such as ConversationBufferMemory or ConversationSummaryMemory from langchain.memory. Pass the memory instance to a ConversationChain wrapping your ChatOpenAI model so the chatbot maintains conversation context across turns.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install langchain langchain-openai langchain-community

Setup

Install the required packages and set your OpenAI API key as an environment variable.

bash
pip install langchain langchain-openai langchain-community
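ChatOpenAI reads the key from the OPENAI_API_KEY environment variable; the placeholder below is where your own key goes:

```shell
export OPENAI_API_KEY="your-api-key-here"
```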

Step by step

This example shows how to create a LangChain chatbot with ConversationBufferMemory to keep track of the conversation history.

python
import os
from langchain_openai import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

# Set your OpenAI API key in environment variable OPENAI_API_KEY

# Initialize the chat model
chat = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# Create a memory instance to store conversation history.
# Keep the default memory_key="history": ConversationChain's default
# prompt expects a {history} variable, and a mismatched key (e.g.
# "chat_history") raises a validation error.
memory = ConversationBufferMemory(return_messages=True)

# Create a conversation chain with memory
conversation = ConversationChain(llm=chat, memory=memory)

# Chat with memory (predict is the standard call; run is deprecated
# in newer LangChain releases)
print(conversation.predict(input="Hello! Who won the World Series in 2020?"))
print(conversation.predict(input="Where was it played?"))
output
The Los Angeles Dodgers won the World Series in 2020.
The 2020 World Series was played at Globe Life Field in Arlington, Texas.

Common variations

  • Use ConversationSummaryMemory to summarize long conversations and reduce token usage.
  • Use the async APIs of LangChain components (e.g. apredict) for asynchronous chatbots.
  • Swap the ChatOpenAI model to gpt-4o or another supported model for higher answer quality.
python
from langchain.memory import ConversationSummaryMemory

# Example using summary memory; keep the default memory_key="history"
# so it matches ConversationChain's default prompt (a custom key like
# "summary" would fail the chain's input-variable validation)
summary_memory = ConversationSummaryMemory(llm=chat)
conversation_with_summary = ConversationChain(llm=chat, memory=summary_memory)

print(conversation_with_summary.predict(input="Tell me about the Eiffel Tower."))
print(conversation_with_summary.predict(input="When was it built?"))
output
The Eiffel Tower is a wrought-iron lattice tower in Paris, France, built as the entrance arch to the 1889 World's Fair.
The Eiffel Tower was constructed between 1887 and 1889.

Troubleshooting

  • If conversation history is not retained, ensure the memory instance is passed to ConversationChain via its memory argument.
  • If ConversationChain raises a validation error about input variables, check that the memory's memory_key matches the prompt (the default prompt expects "history").
  • Set return_messages=True when your prompt uses a MessagesPlaceholder; ConversationChain's default string prompt works without it.
  • For token limit errors, switch to ConversationSummaryMemory or truncate history manually.

Key Takeaways

  • Use LangChain memory classes like ConversationBufferMemory to carry chat context across turns.
  • Pass the memory instance to ConversationChain to enable memory in your chatbot.
  • Switch to summary memory to manage token limits in long conversations.
Verified 2026-04 · gpt-4o-mini, gpt-4o