How to use Mistral for AI agents
Quick answer
Use the mistralai Python SDK or the OpenAI-compatible API to integrate Mistral models into AI agents. Instantiate the client with your API key, then call chat.complete with your chosen model and messages to get AI-generated responses.

Prerequisites
- Python 3.8+
- MISTRAL_API_KEY environment variable set
- pip install mistralai or pip install "openai>=1.0"
Setup
Install the official mistralai Python SDK or use the OpenAI-compatible openai package with the Mistral API base URL. Set your API key in the MISTRAL_API_KEY environment variable.
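For example, on macOS or Linux you can export the key for the current shell session (the value below is a placeholder, not a real key):

```shell
export MISTRAL_API_KEY="your-api-key-here"
```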
```shell
pip install mistralai
```

Step by step
This example shows how to create a simple AI agent using the mistralai SDK to send a chat completion request to the mistral-large-latest model.
```python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Explain the concept of AI agents."}],
)

print(response.choices[0].message.content)
```

Output:

AI agents are autonomous programs that perceive their environment, make decisions, and perform actions to achieve specific goals.
Common variations
You can also use the OpenAI-compatible openai Python SDK by setting base_url to Mistral's API endpoint. For streaming responses, pass stream=True. Lighter models such as mistral-small-latest can be specified for smaller workloads.
```python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["MISTRAL_API_KEY"], base_url="https://api.mistral.ai/v1")

response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "What are AI agents?"}],
    stream=False,
)

print(response.choices[0].message.content)
```

Output:

AI agents are software entities that perceive their environment and act autonomously to accomplish tasks or goals.
Troubleshooting
- If you get authentication errors, verify that MISTRAL_API_KEY is correctly set in your environment.
- For network timeouts, check your internet connection and retry.
- If the model name is invalid, confirm you are using a current Mistral model such as mistral-large-latest.
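For transient network failures, a small retry wrapper with exponential backoff is often enough. This is a generic sketch (with_retries is a hypothetical helper, not part of the Mistral SDK); wrap any SDK call in a zero-argument lambda:

```python
import random
import time

def with_retries(call, max_attempts=3, base_delay=1.0):
    """Call `call()` and retry on errors with exponential backoff plus jitter.

    `call` is any zero-argument callable, e.g.
    lambda: client.chat.complete(model=..., messages=...).
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception:  # narrow this to the SDK's timeout/connection errors in real code
            if attempt == max_attempts:
                raise  # give up after the last attempt
            # wait 1x, 2x, 4x... the base delay, plus a little random jitter
            time.sleep(base_delay * 2 ** (attempt - 1) + random.uniform(0, base_delay))
```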
Key Takeaways
- Use the official mistralai SDK or the OpenAI-compatible openai client with Mistral's API base URL.
- Set your API key securely in the MISTRAL_API_KEY environment variable before making requests.
- Specify a model such as mistral-large-latest and pass messages to chat.complete or chat.completions.create.
- Streaming and async calls are supported via the OpenAI-compatible SDK with stream=True.
- Check model names and API keys carefully to avoid common errors.