How-to · Intermediate · 4 min read

How to use Mistral function calling

Quick answer
Use the mistralai Python SDK or the OpenAI-compatible openai SDK with base_url="https://api.mistral.ai/v1" to call Mistral models. Define your functions as tools in the tools parameter and handle the model's tool_calls response to execute or simulate the requested function.

PREREQUISITES

  • Python 3.8+
  • MISTRAL_API_KEY environment variable set
  • pip install "openai>=1.0" or pip install mistralai

Setup

Install the openai package for OpenAI-compatible usage or the official mistralai SDK. Set your API key in the environment variable MISTRAL_API_KEY.

bash
pip install openai
# or
pip install mistralai
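Before running any example, it helps to fail fast if the key is missing. A minimal sketch — the api_key_configured helper below is just an illustration, not part of either SDK:

```python
import os

def api_key_configured() -> bool:
    # True if MISTRAL_API_KEY is set to a non-empty value.
    return bool(os.environ.get("MISTRAL_API_KEY"))

if not api_key_configured():
    print("MISTRAL_API_KEY is not set; export it before running the examples.")
```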

Step by step

Use the OpenAI-compatible openai SDK to call a Mistral model with function calling. Define the function schema in the tools parameter, check whether the model requests a tool call in the response, and handle it accordingly.

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["MISTRAL_API_KEY"], base_url="https://api.mistral.ai/v1")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
                },
                "required": ["location"]
            }
        }
    }
]

messages = [
    {"role": "user", "content": "What's the weather like in New York?"}
]

response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=messages,
    tools=tools,
    tool_choice="auto"
)

choice = response.choices[0]
message = choice.message

if message.tool_calls:
    tool_call = message.tool_calls[0]
    func_name = tool_call.function.name
    func_args = tool_call.function.arguments
    print(f"Function call requested: {func_name}")
    print(f"Arguments: {func_args}")
    # Here you would implement the function logic or call an API
    # For demo, simulate a response
    function_response = "It's sunny and 75 degrees Fahrenheit in New York."

    # Send the function result back to the model in a "tool" message
    followup_response = client.chat.completions.create(
        model="mistral-large-latest",
        messages=[
            *messages,
            message,
            {"role": "tool", "tool_call_id": tool_call.id, "name": func_name, "content": function_response}
        ]
    )
    print(f"Model reply: {followup_response.choices[0].message.content}")
else:
    print(f"Model reply: {message.content}")
output
Function call requested: get_current_weather
Arguments: {"location": "New York"}
Model reply: It's sunny and 75 degrees Fahrenheit in New York.
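Note that the arguments field arrives as a JSON string, so parse it with json.loads before executing anything. A sketch of the dispatch step, using a hard-coded arguments payload and a local stand-in implementation of get_current_weather:

```python
import json

def get_current_weather(location: str, unit: str = "fahrenheit") -> str:
    # Local stand-in for a real weather API call.
    return f"It's sunny and 75 degrees {unit} in {location}."

# Map tool names to implementations so the model's choice drives dispatch.
AVAILABLE_FUNCTIONS = {"get_current_weather": get_current_weather}

raw_args = '{"location": "New York"}'  # shape of tool_call.function.arguments
parsed = json.loads(raw_args)
result = AVAILABLE_FUNCTIONS["get_current_weather"](**parsed)
print(result)  # It's sunny and 75 degrees fahrenheit in New York.
```

Keeping a name-to-callable map avoids eval-style dispatch and makes it easy to reject tool names the model was never given.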

Common variations

  • Use the official mistralai SDK with client.chat.complete() for function calling.
  • Set tool_choice="none" to disable tool calling.
  • Use async calls with asyncio and the SDK's async client if supported.
  • Switch models like mistral-small-latest for faster responses.
python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
                },
                "required": ["location"]
            }
        }
    }
]

response = client.chat.complete(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Weather in Boston?"}],
    tools=tools,
    tool_choice="auto"
)

message = response.choices[0].message
if message.tool_calls:
    tool_call = message.tool_calls[0]
    print(f"Function call requested: {tool_call.function.name}")
    print(f"Arguments: {tool_call.function.arguments}")
else:
    print(message.content)
output
Function call requested: get_current_weather
Arguments: {"location": "Boston"}
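The async bullet above can be sketched with the openai SDK's AsyncOpenAI client pointed at the same base URL. The import is done lazily so the snippet loads even without the package installed, and the call only fires when MISTRAL_API_KEY is present:

```python
import asyncio
import os

async def ask(question: str) -> str:
    # Lazy import so this sketch can be loaded without the openai package.
    from openai import AsyncOpenAI

    client = AsyncOpenAI(
        api_key=os.environ["MISTRAL_API_KEY"],
        base_url="https://api.mistral.ai/v1",
    )
    response = await client.chat.completions.create(
        model="mistral-small-latest",  # smaller model for faster replies
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

if os.environ.get("MISTRAL_API_KEY"):
    print(asyncio.run(ask("Weather in Boston?")))
```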

Troubleshooting

  • If you get 401 Unauthorized, verify your MISTRAL_API_KEY is set correctly.
  • If tool calls are not triggered, ensure each entry in tools wraps a valid JSON schema under its "function" key.
  • Check network connectivity to https://api.mistral.ai/v1.
  • Use tool_choice="auto" to let the model decide when to call a tool.
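When tool calls never trigger, a quick structural check of each tool definition can catch the usual mistakes. A minimal sketch — validate_tool is an illustrative helper, not part of any SDK:

```python
def validate_tool(tool: dict) -> list:
    # Return a list of problems with one tool definition (empty means it looks OK).
    problems = []
    if tool.get("type") != "function":
        problems.append('top-level "type" must be "function"')
    fn = tool.get("function") or {}
    if not fn.get("name"):
        problems.append('"function.name" is required')
    params = fn.get("parameters") or {}
    if params.get("type") != "object":
        problems.append('"parameters.type" should be "object"')
    for field in params.get("required", []):
        if field not in params.get("properties", {}):
            problems.append(f'required field "{field}" is missing from "properties"')
    return problems

tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}
print(validate_tool(tool))  # []
```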

Key Takeaways

  • Use the OpenAI-compatible openai SDK with base_url="https://api.mistral.ai/v1" for Mistral function calling.
  • Define functions as JSON-schema tools in the tools parameter and set tool_choice="auto" to enable calls.
  • Handle the model's tool_calls response to execute or simulate the requested function and send results back in a tool message.
  • The official mistralai SDK offers similar function calling support with client.chat.complete().
  • Always secure your API key in environment variables and verify correct formatting to avoid common errors.
Verified 2026-04 · mistral-large-latest, mistral-small-latest