How-to · Beginner · 3 min read

How to use ConversationBufferMemory in LangChain

Quick answer
ConversationBufferMemory stores the full chat history verbatim and feeds it back into every prompt. Instantiate it and pass it to your chain's memory parameter to maintain context across turns.
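Conceptually, buffer memory is just a transcript that gets replayed as prompt context on every call. A minimal stdlib-only sketch of the mechanism (this is an illustration, not the actual LangChain class):

```python
class BufferMemorySketch:
    """Toy illustration of what buffer memory does: keep every
    turn verbatim and render it as prompt context."""

    def __init__(self):
        self.turns = []  # list of (speaker, text) pairs, oldest first

    def save_context(self, user_text, ai_text):
        self.turns.append(("Human", user_text))
        self.turns.append(("AI", ai_text))

    @property
    def buffer(self):
        # Full history rendered one "Speaker: text" line per turn
        return "\n".join(f"{who}: {text}" for who, text in self.turns)


memory = BufferMemorySketch()
memory.save_context("Hello!", "Hi, how can I help?")
memory.save_context("What's LangChain?", "A framework for LLM apps.")
print(memory.buffer)
```

Because nothing is summarized or dropped, the rendered context grows with every exchange — which is exactly the trade-off the troubleshooting section below warns about.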

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install langchain langchain-openai langchain-community

Setup

Install the necessary LangChain packages and set your OpenAI API key in the environment.

```bash
pip install langchain langchain-openai langchain-community
export OPENAI_API_KEY="sk-..."
```
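Before running the example, you can confirm the key is actually visible to Python (a missing key is the most common cause of auth errors later on):

```python
import os


def check_openai_key() -> bool:
    """Return True when OPENAI_API_KEY is visible to this process."""
    return bool(os.environ.get("OPENAI_API_KEY"))


if __name__ == "__main__":
    if check_openai_key():
        print("OPENAI_API_KEY is set")
    else:
        print("OPENAI_API_KEY is not set -- export it before continuing")
```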

Step by step

This example shows how to create a chat chain with ConversationBufferMemory to keep track of the conversation history.

```python
from langchain_openai import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# Initialize the chat model; ChatOpenAI reads OPENAI_API_KEY from the environment
chat_model = ChatOpenAI(model="gpt-4o", temperature=0.7)

# Create a ConversationBufferMemory instance
memory = ConversationBufferMemory()

# Create a conversation chain with memory
conversation = ConversationChain(llm=chat_model, memory=memory)

# Simulate a conversation
print(conversation.predict(input="Hello, who won the world series in 2020?"))
print(conversation.predict(input="Where was it played?"))

# Access the conversation history
print("\nConversation history:")
print(memory.buffer)
```
Output:

```
The Los Angeles Dodgers won the World Series in 2020.
The 2020 World Series was played at Globe Life Field in Arlington, Texas.

Conversation history:
Human: Hello, who won the world series in 2020?
AI: The Los Angeles Dodgers won the World Series in 2020.
Human: Where was it played?
AI: The 2020 World Series was played at Globe Life Field in Arlington, Texas.

Common variations

  • Use ConversationBufferMemory with different LLMs like Anthropic Claude or Google Gemini by swapping the llm parameter.
  • Use ConversationSummaryMemory for summarized context instead of full buffer.
  • Implement async calls with the chain's async methods (e.g. apredict) when running inside an event loop.

Troubleshooting

  • If conversation history is empty, ensure ConversationBufferMemory is properly passed to the chain.
  • Check your API key environment variable OPENAI_API_KEY is set correctly.
  • For large conversations, buffer memory can grow large; consider using summarized memory variants.

Key Takeaways

  • Use ConversationBufferMemory to maintain full chat history in LangChain conversations.
  • Pass the memory instance to your conversation chain to enable context retention across calls.
  • Access the conversation history anytime via the memory's buffer attribute.
  • Switch memory types or LLMs easily for different use cases or performance needs.
Verified 2026-04 · gpt-4o