How-to · Beginner · 3 min read

How to use Mistral API

Quick answer
Mistral's API is OpenAI-compatible: point the openai Python SDK at base_url="https://api.mistral.ai/v1", authenticate with your MISTRAL_API_KEY, and call client.chat.completions.create() with models such as mistral-large-latest for chat completions and text generation.

PREREQUISITES

  • Python 3.8+
  • MISTRAL_API_KEY environment variable set
  • pip install "openai>=1.0" (quote the specifier so your shell does not treat >= as redirection)

Setup

Install the openai Python package (version 1.0 or higher) and set your MISTRAL_API_KEY as an environment variable for authentication.

bash
pip install "openai>=1.0"
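To make the key available to the SDK, export it in your shell before running any scripts. A minimal sketch; "your-key-here" is a placeholder for your real key:

```shell
# Placeholder value: replace "your-key-here" with your actual key.
export MISTRAL_API_KEY="your-key-here"

# Confirm the variable is set (prints "set" if non-empty):
echo "${MISTRAL_API_KEY:+set}"
```

Add the export line to your shell profile (e.g. ~/.bashrc) to persist it across sessions.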

Step by step

This example shows how to create a chat completion request to the Mistral API using the OpenAI-compatible SDK pattern.

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["MISTRAL_API_KEY"], base_url="https://api.mistral.ai/v1")

response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Hello, how can I use the Mistral API?"}]
)

print(response.choices[0].message.content)
output
Hello! You can use the Mistral API by sending chat completion requests with your API key and specifying the model you want to use.

Common variations

  • Use mistral-small-latest for a smaller, faster model.
  • For streaming responses, use the stream=True parameter in chat.completions.create().
  • Alternatively, use the official mistralai SDK for more features.

The streaming variation looks like this:

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["MISTRAL_API_KEY"], base_url="https://api.mistral.ai/v1")

# Streaming example
response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Stream a response for me."}],
    stream=True
)

for chunk in response:
    # In openai>=1.0, delta is an object, not a dict; content may be None.
    print(chunk.choices[0].delta.content or "", end="")
output
Streaming response text printed chunk by chunk...

Troubleshooting

  • If you get authentication errors, verify your MISTRAL_API_KEY environment variable is set correctly.
  • For network errors, check your internet connection and the base_url endpoint.
  • If the model is not found, confirm you are using a valid model name like mistral-large-latest.

Key Takeaways

  • Use the OpenAI-compatible openai SDK with base_url="https://api.mistral.ai/v1" to access Mistral models.
  • Set your API key in the MISTRAL_API_KEY environment variable for secure authentication.
  • You can stream responses by passing stream=True to chat.completions.create().
  • Mistral offers multiple models like mistral-large-latest and mistral-small-latest for different use cases.
  • Check your environment variables and model names carefully to avoid common errors.
Verified 2026-04 · mistral-large-latest, mistral-small-latest