How to use conversation history effectively
Quick answer
Use conversation history by passing previous messages as a list of role/content pairs to the messages parameter in chat completions. This maintains context across turns, enabling the model to generate coherent, relevant responses.

Prerequisites
- Python 3.8+
- OpenAI API key (free tier works)
- pip install openai>=1.0
Setup
Install the openai Python package and set your API key as an environment variable for secure access.
pip install openai>=1.0

Step by step
Pass conversation history as a list of messages with roles user, assistant, and optionally system. This example shows a simple chat with history to maintain context.
import os
from openai import OpenAI
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
conversation_history = [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Hello, who won the 2024 Olympics 100m sprint?"},
{"role": "assistant", "content": "The winner of the 2024 Olympics 100m sprint was John Doe."},
{"role": "user", "content": "What was his winning time?"}
]
response = client.chat.completions.create(
model="gpt-4o",
messages=conversation_history
)
print(response.choices[0].message.content)

Output
The winning time for John Doe in the 2024 Olympics 100m sprint was 9.79 seconds.
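The chat API is stateless: to continue the conversation, you must append the assistant's reply and the next user question to the history yourself before the next call. A minimal sketch of that bookkeeping, using a hypothetical helper name (extend_history is not part of the OpenAI SDK):

```python
# The API is stateless, so each new turn must be appended to the history
# yourself before the next call. `extend_history` is a hypothetical helper
# name, not an SDK function.
def extend_history(history, assistant_reply, next_question):
    """Return history extended with the model's reply and the next user turn."""
    return history + [
        {"role": "assistant", "content": assistant_reply},
        {"role": "user", "content": next_question},
    ]

history = [{"role": "user", "content": "Who won the race?"}]
history = extend_history(history, "John Doe won.", "What was his time?")
# history now holds three messages, ready to pass as `messages` in the
# next client.chat.completions.create(...) call.
```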
Common variations
Conversation history works the same way with other providers. With Anthropic's SDK and models like claude-3-5-sonnet-20241022, the system prompt is passed as a separate system parameter rather than as a message role. You can also stream responses for real-time output, and async clients accept the same messages list.
import os
from anthropic import Anthropic
client = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
history = [
{"role": "user", "content": "Explain photosynthesis."},
{"role": "assistant", "content": "Photosynthesis is the process by which plants convert light into energy."},
{"role": "user", "content": "What are the main stages?"}
]
message = client.messages.create(
model="claude-3-5-sonnet-20241022",
max_tokens=200,
system="You are a helpful assistant.",
messages=history
)
print(message.content[0].text)

Output
The main stages of photosynthesis are the light-dependent reactions and the Calvin cycle.
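Because Anthropic takes the system prompt as a separate parameter while OpenAI keeps it in the messages list, moving a history between the two SDKs means splitting it apart. A sketch with a hypothetical helper (split_for_anthropic is not part of either SDK):

```python
# Anthropic's API takes the system prompt as a separate `system` parameter,
# while OpenAI keeps it in the messages list. This hypothetical helper
# splits an OpenAI-style history into Anthropic's shape.
def split_for_anthropic(history):
    """Return (system_text, messages) from an OpenAI-style history."""
    system_text = " ".join(
        m["content"] for m in history if m["role"] == "system"
    )
    messages = [m for m in history if m["role"] != "system"]
    return system_text, messages

system_text, messages = split_for_anthropic([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain photosynthesis."},
])
# system_text -> "You are a helpful assistant."
# messages    -> just the user message
```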
Troubleshooting
If the model forgets context, make sure every relevant previous message is included in the messages list; the API is stateless, so anything you omit is invisible to the model. To avoid exceeding the model's context window, truncate older turns or replace them with a summary.
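A simple way to truncate is to keep the system message plus the most recent turns that fit under a budget. The sketch below assumes roughly 4 characters per token as a crude estimate; use a real tokenizer (e.g. tiktoken) for accurate counts. trim_history is a hypothetical helper, not an SDK function:

```python
# Rough history trimming: keep the system message plus the newest turns
# whose combined length fits a character budget (a crude stand-in for a
# token count). `trim_history` is a hypothetical helper, not an SDK call.
def trim_history(history, max_chars=8000):
    """Keep any system message plus the most recent turns that fit."""
    system = [m for m in history if m["role"] == "system"]
    rest = [m for m in history if m["role"] != "system"]
    kept, total = [], 0
    for msg in reversed(rest):          # walk newest -> oldest
        total += len(msg["content"])
        if total > max_chars:
            break                       # drop this turn and everything older
        kept.append(msg)
    return system + list(reversed(kept))

long_history = [{"role": "system", "content": "Be brief."}] + [
    {"role": "user", "content": "x" * 3000},
    {"role": "assistant", "content": "y" * 3000},
    {"role": "user", "content": "z" * 3000},
]
trimmed = trim_history(long_history, max_chars=7000)
# The oldest 3000-char turn is dropped; the system message survives.
```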
Key Takeaways
- Always include prior messages in the messages parameter to maintain context.
- Use system role messages to set behavior and tone for the assistant.
- Manage token limits by trimming or summarizing long conversation histories.
- Conversation history works consistently across OpenAI and Anthropic SDKs with similar message structures.