How to use Mistral with the OpenAI SDK
Quick answer
Use the OpenAI Python SDK with base_url set to Mistral's API endpoint and your Mistral API key read from the environment. Call client.chat.completions.create with mistral-large-latest (or another Mistral model) and your chat messages.
Prerequisites
- Python 3.8+
- A Mistral API key
- pip install "openai>=1.0"
Setup
Install the openai Python package, version 1.0 or higher, and set your Mistral API key in the MISTRAL_API_KEY environment variable before running the code.
pip install "openai>=1.0"
Step by step
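Since a missing environment variable only surfaces later as an authentication error, it can help to fail fast with a clear message before constructing the client. A minimal sketch; the helper name require_api_key is our own, not part of any SDK:

```python
import os

def require_api_key(name="MISTRAL_API_KEY"):
    """Return the named API key from the environment, or fail with a clear message."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"Set the {name} environment variable before running.")
    return key
```

You can then pass require_api_key() as the api_key argument when creating the client.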
This example shows how to create a chat completion request to Mistral's mistral-large-latest model using the OpenAI SDK with the base_url override.
import os
from openai import OpenAI
client = OpenAI(api_key=os.environ["MISTRAL_API_KEY"], base_url="https://api.mistral.ai/v1")
response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Hello, how can I use Mistral with OpenAI SDK?"}],
)
print(response.choices[0].message.content)
Output
Hello! You can use the OpenAI SDK by setting the base_url to Mistral's API endpoint and specifying a Mistral model like "mistral-large-latest" in your chat completion requests.
Common variations
- Use different Mistral models such as mistral-small-latest or codestral-latest by changing the model parameter.
- For asynchronous calls, use Python's asyncio with the SDK's AsyncOpenAI client.
- Streaming responses are supported by passing stream=True to chat.completions.create.
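The variations above differ only in the keyword arguments passed to chat.completions.create, so they can be captured in one small helper. A sketch under that assumption; build_chat_request is a hypothetical name, not an SDK function:

```python
def build_chat_request(prompt, model="mistral-large-latest", stream=False):
    """Assemble keyword arguments for client.chat.completions.create."""
    request = {
        "model": model,  # e.g. mistral-small-latest or codestral-latest
        "messages": [{"role": "user", "content": prompt}],
    }
    if stream:
        request["stream"] = True  # ask the API for incremental chunks
    return request
```

Usage: client.chat.completions.create(**build_chat_request("Hi", model="mistral-small-latest")).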
import asyncio
import os

from openai import AsyncOpenAI

async def async_chat():
    # The async client exposes the same chat.completions.create method
    client = AsyncOpenAI(api_key=os.environ["MISTRAL_API_KEY"],
                         base_url="https://api.mistral.ai/v1")
    stream = await client.chat.completions.create(
        model="mistral-large-latest",
        messages=[{"role": "user", "content": "Stream a response from Mistral."}],
        stream=True,
    )
    async for chunk in stream:
        # delta.content can be None on the final chunk
        print(chunk.choices[0].delta.content or "", end="")

asyncio.run(async_chat())
Output
Streaming response text from Mistral appears here in real time...
Troubleshooting
- If you get authentication errors, verify that your MISTRAL_API_KEY environment variable is set correctly.
- Ensure the base_url is exactly https://api.mistral.ai/v1 to avoid endpoint errors.
- Check your network connectivity if requests time out.
Key Takeaways
- Set base_url to https://api.mistral.ai/v1 when using Mistral with the OpenAI SDK.
- Use environment variables for your Mistral API key to keep credentials secure.
- The OpenAI SDK supports async and streaming with Mistral models.
- Switch models by changing the model parameter in your requests.