How-to · Beginner · 3 min read

How to use DeepSeek on Groq

Quick answer
Use the openai Python SDK with an OpenAI-compatible base URL and the matching API key, then create chat completions with client.chat.completions.create(). The examples below call the deepseek-chat model through DeepSeek's own endpoint (base_url="https://api.deepseek.com") with a DeepSeek API key. Note that this endpoint is DeepSeek's API, not Groq's: to run DeepSeek models on Groq hardware, point the same code at Groq's OpenAI-compatible endpoint (https://api.groq.com/openai/v1) with a Groq API key and one of the DeepSeek models Groq hosts (at the time of writing, distills such as deepseek-r1-distill-llama-70b).

Prerequisites

  • Python 3.8+
  • DeepSeek API key
  • pip install "openai>=1.0" (quote the requirement so the shell does not treat > as a redirect)

Setup

Install the openai Python package and set your DeepSeek API key as an environment variable.

  • Install the SDK: pip install "openai>=1.0"
  • Set the environment variable: export DEEPSEEK_API_KEY=your_api_key (Linux/macOS) or setx DEEPSEEK_API_KEY your_api_key (Windows; setx only takes effect in newly opened shells)

```bash
pip install "openai>=1.0"
```
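Before making any calls, it is worth confirming the variable is actually visible to Python. A small stdlib-only helper (our own convenience function, not part of the SDK) fails fast with a clear message instead of a later 401 from the API:

```python
import os

def require_api_key(name: str = "DEEPSEEK_API_KEY") -> str:
    # Read the key from the environment and fail fast if it is missing,
    # rather than surfacing as an authentication error at request time.
    key = os.environ.get(name, "").strip()
    if not key:
        raise RuntimeError(f"{name} is not set; export it before running the examples below.")
    return key
```

You can then pass require_api_key() as the api_key= argument when constructing the client.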

Step by step

Use the OpenAI-compatible SDK with the DeepSeek base URL to call the deepseek-chat model. Pass your messages as usual to get chat completions.

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",
)

messages = [
    {"role": "user", "content": "Explain the benefits of using DeepSeek on Groq."}
]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages
)

print(response.choices[0].message.content)
```

Example output:

```
DeepSeek on Groq offers fast, scalable access to advanced LLMs with OpenAI-compatible APIs, enabling seamless integration and powerful AI capabilities.
```

Common variations

You can stream tokens as they are generated, or switch to other DeepSeek models such as deepseek-reasoner for reasoning tasks. Async usage is also supported via the AsyncOpenAI client with async/await.

```python
import asyncio
import os

from openai import AsyncOpenAI

async def main():
    # Use the async client; the sync OpenAI client cannot be iterated with `async for`.
    client = AsyncOpenAI(
        api_key=os.environ["DEEPSEEK_API_KEY"],
        base_url="https://api.deepseek.com",
    )

    # With stream=True the async client returns an async iterator of chunks.
    stream = await client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Stream a summary of DeepSeek."}],
        stream=True,
    )

    async for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="", flush=True)

asyncio.run(main())
```

Example output:

```
DeepSeek is a powerful AI platform providing fast and reliable chat completions via an OpenAI-compatible API...
```
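Switching to deepseek-reasoner only requires changing the model name; per DeepSeek's API docs, the reasoner additionally returns its chain of thought in a separate reasoning_content field alongside the final answer. A minimal sketch (build_reasoner_request is our own helper, and the network call is guarded so the snippet is a no-op without a key):

```python
import os

def build_reasoner_request(prompt: str) -> dict:
    # Kwargs for client.chat.completions.create(); only the model name
    # differs from the deepseek-chat example above.
    return {
        "model": "deepseek-reasoner",
        "messages": [{"role": "user", "content": prompt}],
    }

if os.environ.get("DEEPSEEK_API_KEY"):  # skip the network call when no key is set
    from openai import OpenAI

    client = OpenAI(
        api_key=os.environ["DEEPSEEK_API_KEY"],
        base_url="https://api.deepseek.com",
    )
    response = client.chat.completions.create(**build_reasoner_request("What is 17 * 24?"))
    print(response.choices[0].message.reasoning_content)  # the model's chain of thought
    print(response.choices[0].message.content)            # the final answer
```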

Troubleshooting

  • If you get authentication errors, verify your DEEPSEEK_API_KEY environment variable is set correctly.
  • For connection issues, ensure your network allows access to https://api.deepseek.com.
  • If the model is not found, confirm you are using the correct model name deepseek-chat.

Key Takeaways

  • Use the OpenAI SDK with base_url set to the provider's OpenAI-compatible endpoint: https://api.deepseek.com for DeepSeek's own API, or https://api.groq.com/openai/v1 for Groq-hosted DeepSeek models.
  • The model name for chat is deepseek-chat and supports standard OpenAI chat completions calls.
  • Streaming and async calls are supported for efficient real-time responses.
  • Always set your API key in the environment variable DEEPSEEK_API_KEY to avoid authentication errors.
Verified 2026-04 · deepseek-chat, deepseek-reasoner