How-to · Beginner · 3 min read

How to save chat history in LlamaIndex

Quick answer
To save chat history in LlamaIndex, serialize the ChatMessage objects held in the memory buffer (LlamaIndex's ChatMemoryBuffer) to a file such as JSON. Later, reload this data to restore the conversation context for persistent chat sessions. (Note: ConversationBufferMemory is LangChain's class; the LlamaIndex equivalent is ChatMemoryBuffer.)

PREREQUISITES

  • Python 3.8+
  • pip install "llama-index>=0.10.0"
  • Basic knowledge of Python file I/O

Setup

Install llama-index via pip and import necessary classes for chat memory management.

bash
pip install "llama-index>=0.10.0"

Step by step

This example demonstrates creating a chat memory buffer, saving its history to a JSON file, and loading it back to continue the conversation.

python
import json

# Import paths for llama-index >= 0.10 (llama_index.core namespace)
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer

# Initialize memory
memory = ChatMemoryBuffer.from_defaults()

# Simulate adding chat messages
memory.put(ChatMessage(role="user", content="Hello!"))
memory.put(ChatMessage(role="assistant", content="Hi! How can I help you?"))

# Serialize chat history to JSON (role is a MessageRole enum, so use .value)
chat_history = [
    {"role": msg.role.value, "content": msg.content}
    for msg in memory.get_all()
]

with open("chat_history.json", "w", encoding="utf-8") as f:
    json.dump(chat_history, f, indent=2)

print("Chat history saved to chat_history.json")

# Later, load chat history from JSON
with open("chat_history.json", "r", encoding="utf-8") as f:
    loaded_history = json.load(f)

# Rebuild chat memory
new_memory = ChatMemoryBuffer.from_defaults()
for msg in loaded_history:
    new_memory.put(ChatMessage(role=msg["role"], content=msg["content"]))

# Verify loaded messages
for msg in new_memory.get_all():
    print(f"{msg.role.value}: {msg.content}")
output
Chat history saved to chat_history.json
user: Hello!
assistant: Hi! How can I help you?

Common variations

  • Use other storage formats like SQLite or Pickle for more complex persistence.
  • Integrate with llama-index query engines to maintain context across queries.
  • Use async file I/O for non-blocking chat history saving in async applications.
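As a sketch of the SQLite variation: store one row per message and rely on the row id to preserve insertion order. The table name and schema here are illustrative choices, not a LlamaIndex API.

python
import sqlite3

# Illustrative schema: one row per chat message, ordered by insertion id
conn = sqlite3.connect("chat_history.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS chat_history ("
    "id INTEGER PRIMARY KEY AUTOINCREMENT, role TEXT, content TEXT)"
)

# Save messages (plain dicts here; adapt from ChatMessage objects as above)
messages = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi! How can I help you?"},
]
conn.executemany(
    "INSERT INTO chat_history (role, content) VALUES (?, ?)",
    [(m["role"], m["content"]) for m in messages],
)
conn.commit()

# Load them back in insertion order
rows = conn.execute(
    "SELECT role, content FROM chat_history ORDER BY id"
).fetchall()
loaded = [{"role": r, "content": c} for r, c in rows]
conn.close()

Unlike a single JSON file, this lets you append messages without rewriting the whole history on every save.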

Troubleshooting

  • If chat history fails to load, verify JSON file integrity and correct file path.
  • Ensure ChatMessage roles and content are correctly serialized and deserialized.
  • For large histories, consider incremental saving to avoid data loss.
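Incremental saving can be as simple as appending each message as one JSON line (JSONL): a crash then loses at most the message being written, never the whole file. A minimal sketch (the helper names are illustrative):

python
import json

def append_message(path, role, content):
    """Append a single message as one JSON line (JSONL)."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps({"role": role, "content": content}) + "\n")

def load_messages(path):
    """Rebuild the full history by reading the file line by line."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

append_message("chat_history.jsonl", "user", "Hello!")
append_message("chat_history.jsonl", "assistant", "Hi! How can I help you?")
print(load_messages("chat_history.jsonl"))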

Key Takeaways

  • Serialize ChatMemoryBuffer messages to JSON for easy chat history saving.
  • Reload saved messages into a ChatMemoryBuffer to restore chat context.
  • Consider storage format and size when choosing how to save chat history.
Verified 2026-04