How to use the Mistral API
Quick answer
Use the openai Python SDK with base_url="https://api.mistral.ai/v1" and your MISTRAL_API_KEY to call client.chat.completions.create() with models like mistral-large-latest. This OpenAI-compatible pattern makes chat completions and text generation straightforward to integrate.

Prerequisites
- Python 3.8+
- MISTRAL_API_KEY environment variable set
- pip install "openai>=1.0"
Setup
Install the openai Python package (version 1.0 or higher) and set your MISTRAL_API_KEY as an environment variable for authentication.
```shell
pip install "openai>=1.0"
```

Step by step
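Before making any requests, it can help to confirm the key is actually visible to Python. Below is a minimal illustrative check; check_api_key is a hypothetical helper, not part of any SDK:

```python
import os

def check_api_key(env=None) -> bool:
    """Return True if MISTRAL_API_KEY is set and non-empty (illustrative helper)."""
    env = os.environ if env is None else env
    return bool(env.get("MISTRAL_API_KEY"))

# Usage:
# if not check_api_key():
#     raise SystemExit("Set MISTRAL_API_KEY before running the examples.")
```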
This example shows how to create a chat completion request to the Mistral API using the OpenAI-compatible SDK pattern.
```python
import os
from openai import OpenAI

# Point the OpenAI-compatible client at the Mistral endpoint
client = OpenAI(
    api_key=os.environ["MISTRAL_API_KEY"],
    base_url="https://api.mistral.ai/v1",
)

response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Hello, how can I use the Mistral API?"}],
)
print(response.choices[0].message.content)
```

Output:

```
Hello! You can use the Mistral API by sending chat completion requests with your API key and specifying the model you want to use.
```
Common variations
- Use mistral-small-latest for a smaller, faster model.
- For streaming responses, pass stream=True to chat.completions.create().
- Alternatively, use the official mistralai SDK for more features.
```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["MISTRAL_API_KEY"],
    base_url="https://api.mistral.ai/v1",
)

# Streaming example: chunks arrive as they are generated
response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Stream a response for me."}],
    stream=True,
)
for chunk in response:
    # In openai>=1.0 the delta is an object, not a dict, and content may be None
    print(chunk.choices[0].delta.content or "", end="")
```

Output:

```
Streaming response text printed chunk by chunk...
```
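The official mistralai SDK mentioned above exposes its own client with a slightly different call shape. The following is a rough sketch, assuming mistralai>=1.0; build_messages and ask_mistral are illustrative helpers, not SDK functions:

```python
import os

def build_messages(prompt):
    """Shape a single-turn chat payload (illustrative helper)."""
    return [{"role": "user", "content": prompt}]

def ask_mistral(prompt):
    """Send one prompt via the official mistralai SDK (assumes mistralai>=1.0)."""
    from mistralai import Mistral  # pip install mistralai
    client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])
    response = client.chat.complete(
        model="mistral-large-latest",
        messages=build_messages(prompt),
    )
    return response.choices[0].message.content

# Usage (requires MISTRAL_API_KEY and network access):
# print(ask_mistral("Hello from the official SDK!"))
```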
Troubleshooting
- If you get authentication errors, verify your MISTRAL_API_KEY environment variable is set correctly.
- For network errors, check your internet connection and the base_url endpoint.
- If the model is not found, confirm you are using a valid model name like mistral-large-latest.
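For transient network errors, a simple retry wrapper often resolves the problem without manual intervention. This is an illustrative sketch; with_retries is not part of the openai or mistralai SDKs:

```python
import time

def with_retries(call, attempts=3, backoff=0.5):
    """Call `call()` and retry on exceptions with exponential backoff.
    Illustrative helper, not part of any SDK."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the original error
            time.sleep(backoff * (2 ** attempt))

# Usage with the OpenAI-compatible client (requires MISTRAL_API_KEY):
# response = with_retries(lambda: client.chat.completions.create(
#     model="mistral-large-latest",
#     messages=[{"role": "user", "content": "Hello"}],
# ))
```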
Key Takeaways
- Use the OpenAI-compatible openai SDK with base_url="https://api.mistral.ai/v1" to access Mistral models.
- Set your API key in the MISTRAL_API_KEY environment variable for secure authentication.
- Stream responses by passing stream=True to chat.completions.create().
- Mistral offers multiple models, such as mistral-large-latest and mistral-small-latest, for different use cases.
- Check your environment variables and model names carefully to avoid common errors.