How to migrate from OpenAI to Claude API
Quick answer
To migrate from the OpenAI API to the Claude API, replace the OpenAI client with the Anthropic client (anthropic.Anthropic), move any system prompt out of the messages list and into Claude's top-level system parameter, and call client.messages.create with a Claude model such as claude-3-5-sonnet-20241022. Note that max_tokens is required on every Claude request.
Prerequisites
- Python 3.8+
- OpenAI API key (for existing code)
- Anthropic API key
- pip install openai>=1.0
- pip install anthropic>=0.20
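The core formatting difference is that OpenAI embeds the system prompt as a message with role "system", while Claude takes it as a separate top-level system parameter. A small helper can perform this conversion during migration (a sketch; the name convert_messages is ours, not part of either SDK):

```python
def convert_messages(openai_messages):
    """Split an OpenAI-style message list into Claude's (system, messages) pair.

    OpenAI puts the system prompt inside the messages list with role "system";
    Claude expects it as a separate top-level `system` string.
    """
    system_parts = [m["content"] for m in openai_messages if m["role"] == "system"]
    chat = [m for m in openai_messages if m["role"] != "system"]
    return "\n".join(system_parts), chat

system, messages = convert_messages([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, how are you?"},
])
print(system)    # You are a helpful assistant.
print(messages)  # [{'role': 'user', 'content': 'Hello, how are you?'}]
```

The returned pair maps directly onto the system and messages arguments of client.messages.create.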
Setup
Install both the OpenAI and Anthropic Python SDKs and set your environment variables for API keys.
- Install OpenAI SDK: pip install openai>=1.0
- Install Anthropic SDK: pip install anthropic>=0.20
- Set environment variables:
  export OPENAI_API_KEY='your_openai_key'
  export ANTHROPIC_API_KEY='your_claude_key'
Both SDKs can also be installed in one command: pip install openai>=1.0 anthropic>=0.20
Step by step
Here is a complete runnable example showing how to migrate a simple chat completion from OpenAI to Claude using Python.
import os
from openai import OpenAI
import anthropic

# OpenAI example
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print("OpenAI response:", response.choices[0].message.content)

# Anthropic Claude example
client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=300,
    system="You are a helpful assistant.",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print("Claude response:", message.content[0].text)

Output:
OpenAI response: I'm doing well, thank you! How can I assist you today?
Claude response: I'm doing great, thanks for asking! How can I help you today?
Common variations
You can customize the migration by using different Claude models, enabling streaming, or using async calls.
- Use claude-3-5-haiku-20241022 for a lighter, faster model.
- Use async with asyncio and anthropic.AsyncAnthropic, then await client.messages.create(...).
- Streaming is supported via stream=True in messages.create, or via the client.messages.stream(...) helper.
import asyncio
import anthropic

async def async_claude_call():
    # Async calls use the AsyncAnthropic client; the messages.create API is identical
    client = anthropic.AsyncAnthropic()
    response = await client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=200,
        system="You are a helpful assistant.",
        messages=[{"role": "user", "content": "Tell me a joke."}],
    )
    print("Async Claude response:", response.content[0].text)

asyncio.run(async_claude_call())

Output:
Async Claude response: Why did the scarecrow win an award? Because he was outstanding in his field!
Troubleshooting
If you see authentication errors, verify your API keys are set correctly in environment variables. If the response is empty or incomplete, check max_tokens and model compatibility. For rate limits, implement retries with exponential backoff.
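Exponential backoff for rate limits can be sketched with a generic stdlib-only wrapper (retry_with_backoff and the flaky_call stub are our illustrations; with the Anthropic SDK you would pass retry_on=(anthropic.RateLimitError,)):

```python
import random
import time

def retry_with_backoff(fn, max_retries=5, base_delay=1.0, retry_on=(Exception,)):
    """Call fn(); on a retryable error, wait base_delay * 2**attempt plus jitter, then retry."""
    for attempt in range(max_retries):
        try:
            return fn()
        except retry_on:
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error to the caller
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)

# Hypothetical flaky call: fails twice, then succeeds
attempts = {"n": 0}
def flaky_call():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("rate limited")
    return "ok"

print(retry_with_backoff(flaky_call, base_delay=0.01))  # prints "ok" on the third attempt
```

Wrapping your client.messages.create call in a lambda and passing it to this helper keeps the retry policy in one place.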
Key Takeaways
- Replace OpenAI client with Anthropic client and adjust message format using the system parameter.
- Use Claude models like claude-3-5-sonnet-20241022 for best coding and chat performance.
- Set API keys securely via environment variables for both OpenAI and Anthropic.
- Async and streaming calls are supported in the Anthropic SDK for advanced use cases.
- Check max_tokens and model names carefully to avoid runtime errors during migration.