How to use DeepSeek with OpenAI SDK
Quick answer
Use the OpenAI SDK with the base_url parameter set to DeepSeek's API endpoint and your DEEPSEEK_API_KEY. Call client.chat.completions.create() with the deepseek-chat model and your messages to get AI completions from DeepSeek.
Prerequisites
- Python 3.8+
- DeepSeek API key
- pip install "openai>=1.0"
Setup
Install the OpenAI Python SDK and set your DeepSeek API key as an environment variable. Use the base_url parameter to point the client to DeepSeek's API endpoint.
pip install "openai>=1.0"
Step by step
This example shows how to create a DeepSeek client using the OpenAI SDK and send a chat completion request to the deepseek-chat model.
```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello, DeepSeek!"}],
)
print(response.choices[0].message.content)
```
Output:
```
Hello! How can I assist you today?
```
Common variations
- Use a different DeepSeek model, such as deepseek-reasoner, by changing the model parameter.
- Stream completions incrementally by passing stream=True to client.chat.completions.create().
- Use async calls with Python asyncio via the SDK's AsyncOpenAI client if needed.
Troubleshooting
- If you get authentication errors, verify that your DEEPSEEK_API_KEY environment variable is set correctly.
- Check network connectivity to https://api.deepseek.com.
- Ensure you are using the latest OpenAI SDK version to avoid compatibility issues.
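To rule out the first failure mode quickly, a small stdlib-only check (the variable name matches the setup above; the helper name is illustrative) can confirm the key is present before any request is made:

```python
import os

def check_api_key(var: str = "DEEPSEEK_API_KEY") -> bool:
    """Return True if the given environment variable is set and non-empty."""
    return bool(os.environ.get(var, "").strip())

if not check_api_key():
    print("DEEPSEEK_API_KEY is not set; export it before running the client.")
```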
Key Takeaways
- Use the OpenAI SDK with base_url="https://api.deepseek.com" to access DeepSeek models.
- Set your DeepSeek API key in os.environ["DEEPSEEK_API_KEY"] for authentication.
- Call client.chat.completions.create() with model="deepseek-chat" for chat completions.
- You can switch models or enable streaming by adjusting parameters in the SDK call.
- Verify environment variables and network access if you encounter errors.