How-to · Beginner · 3 min read

How to use Mixtral on Groq

Quick answer
Use the OpenAI Python SDK with your Groq API key and set base_url="https://api.groq.com/openai/v1". Call client.chat.completions.create with model="mixtral-8x7b-32768" and your messages to interact with Mixtral on Groq.

Prerequisites

  • Python 3.8+
  • Groq API key (set as environment variable GROQ_API_KEY)
  • pip install "openai>=1.0" (quote the version specifier so the shell does not treat > as a redirect)

Setup

Install the official openai Python package and set your Groq API key as an environment variable.

  • Run pip install openai to install the SDK.
  • Export your API key: export GROQ_API_KEY=your_api_key_here (Linux/macOS) or set it in your environment variables on Windows.
bash
pip install openai
export GROQ_API_KEY=your_api_key_here
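Before making any calls, it can help to confirm the key is actually visible to Python. A minimal sketch (the helper name and error message are illustrative):

```python
import os

def get_groq_key() -> str:
    """Read the Groq API key from the environment, failing loudly if it is missing."""
    key = os.environ.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError("GROQ_API_KEY is not set; export it before running the examples below.")
    return key
```

If this raises, fix your environment before moving on; every example in this article assumes the variable is set.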

Step by step

Use the OpenAI SDK with the Groq base URL and call the Mixtral model for chat completions. Below is a complete runnable example.

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["GROQ_API_KEY"], base_url="https://api.groq.com/openai/v1")

messages = [
    {"role": "user", "content": "Explain the benefits of using Mixtral on Groq."}
]

response = client.chat.completions.create(
    model="mixtral-8x7b-32768",
    messages=messages
)

print(response.choices[0].message.content)
output
Mixtral on Groq offers high-performance inference with optimized hardware acceleration, enabling fast and cost-effective large language model deployments.
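The messages list follows the standard Chat Completions format, so a multi-turn conversation is simply a longer list. A small helper (the function name is hypothetical) makes the structure explicit:

```python
def build_messages(history, user_prompt, system_prompt=None):
    """Assemble a Chat Completions messages list: an optional system message,
    the prior turns, then the new user prompt."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.extend(history)
    messages.append({"role": "user", "content": user_prompt})
    return messages
```

Pass the result straight to client.chat.completions.create(...), then append the assistant's reply to history before the next turn.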

Common variations

You can also use streaming to receive tokens as they are generated or switch to other Groq models by changing the model parameter.

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["GROQ_API_KEY"], base_url="https://api.groq.com/openai/v1")

messages = [
    {"role": "user", "content": "Tell me a joke about AI."}
]

# Streaming example
stream = client.chat.completions.create(
    model="mixtral-8x7b-32768",
    messages=messages,
    stream=True
)

for chunk in stream:
    delta = chunk.choices[0].delta.content or ""
    print(delta, end="", flush=True)
print()  # finish the line once the stream ends
output
Why did the AI go to school? Because it had a lot of learning to do!
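When streaming, you often want the full text afterwards as well, for example to store it in the conversation history. Joining the deltas as they arrive is enough; a minimal sketch that works with any iterable of delta strings (the final chunk's delta can be None):

```python
def collect_stream(deltas):
    """Join streamed content deltas into the final completion text,
    treating None deltas (sent on the final chunk) as empty strings."""
    parts = []
    for delta in deltas:
        parts.append(delta or "")
    return "".join(parts)
```

In the loop above, you would collect each chunk.choices[0].delta.content value as you print it, then join them at the end.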

Troubleshooting

  • If you get authentication errors, verify your GROQ_API_KEY environment variable is set correctly.
  • If the model is not found, confirm you are using the exact model name mixtral-8x7b-32768 and the correct base_url.
  • For network issues, check your internet connection and firewall settings.
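These checks can also be scripted. The sketch below maps the HTTP status codes the API typically returns to the fixes listed above; the status-code meanings are standard HTTP, but the mapping itself is illustrative:

```python
def troubleshoot(status_code: int) -> str:
    """Map common Groq API HTTP error statuses to a suggested fix."""
    hints = {
        401: "Authentication failed: verify that GROQ_API_KEY is set correctly.",
        404: "Not found: confirm the model name mixtral-8x7b-32768 and the base_url.",
        429: "Rate limited: wait and retry, or reduce request frequency.",
    }
    return hints.get(status_code, "Check your internet connection and firewall settings.")
```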

Key Takeaways

  • Use the OpenAI Python SDK with Groq's base_url to access Mixtral models.
  • Set your Groq API key in the environment variable GROQ_API_KEY for authentication.
  • Streaming responses enable token-by-token output for interactive applications.
Verified 2026-04 · mixtral-8x7b-32768