Concept beginner · 3 min read

What is function calling in LLMs?

Quick answer
Function calling in LLMs is a capability that lets a language model output structured data representing a call to a predefined function, turning natural language requests into precise, machine-readable commands that an application can execute against APIs or code.

How it works

Function calling in large language models (LLMs) works by enabling the model to produce structured outputs that represent calls to predefined functions or APIs. Instead of just generating plain text, the model outputs JSON-like objects specifying the function name and arguments. This is similar to a human writing a command in a programming language based on a natural language request. The LLM acts as a translator from natural language to structured commands, allowing seamless integration with external systems.
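The translation step can be sketched in a few lines: the model emits a function name plus JSON-encoded arguments, and the application looks up and invokes the matching local function. The `model_output` dict below is a hypothetical example of such a structured output, and `get_weather` is a stub standing in for real code:

```python
import json

# Hypothetical structured output from a model: a function name plus JSON-encoded arguments.
model_output = {
    "name": "get_weather",
    "arguments": '{"location": "New York", "date": "2026-04-27"}',
}

def get_weather(location: str, date: str) -> str:
    # Stub; a real implementation would query a weather API.
    return f"Forecast for {location} on {date}: sunny"

# Dispatch: look up the named function and call it with the parsed arguments.
registry = {"get_weather": get_weather}
args = json.loads(model_output["arguments"])
print(registry[model_output["name"]](**args))
# prints: Forecast for New York on 2026-04-27: sunny
```

The model never runs code itself; it only names the function and supplies arguments, and the application stays in control of execution.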

Concrete example

Here is an example using the OpenAI Python SDK to demonstrate function calling with the gpt-4o model. A function named get_weather, with location and date parameters, is declared as a tool via the tools parameter (which supersedes the deprecated functions parameter), and the model decides whether to call it:

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather forecast for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                    "date": {"type": "string", "description": "Date in YYYY-MM-DD format"}
                },
                "required": ["location", "date"]
            }
        }
    }
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is the weather in New York tomorrow?"}],
    tools=tools,
    tool_choice="auto"
)

tool_call = response.choices[0].message.tool_calls[0]
print(tool_call.function.name, tool_call.function.arguments)
output
get_weather {"location": "New York", "date": "2026-04-27"}
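A tool call is only half the round trip: the application executes the function, then sends the result back as a tool message so the model can compose a natural-language answer. The sketch below constructs that follow-up message list locally, using the Chat Completions tool-message format; the tool call ID and the stub weather result are assumptions for illustration:

```python
import json

# Assumed values standing in for a real model response.
tool_call_id = "call_abc123"  # hypothetical ID; real IDs come from the API response
name = "get_weather"
arguments = '{"location": "New York", "date": "2026-04-27"}'

def get_weather(location: str, date: str) -> dict:
    # Stub; a real implementation would query a weather service.
    return {"forecast": "sunny", "high_f": 72}

result = get_weather(**json.loads(arguments))

# Messages for the follow-up request: echo the assistant's tool call,
# then supply the tool result so the model can answer in natural language.
followup_messages = [
    {"role": "user", "content": "What is the weather in New York tomorrow?"},
    {
        "role": "assistant",
        "tool_calls": [{
            "id": tool_call_id,
            "type": "function",
            "function": {"name": name, "arguments": arguments},
        }],
    },
    {"role": "tool", "tool_call_id": tool_call_id, "content": json.dumps(result)},
]
print(followup_messages[-1]["content"])
```

Passing `followup_messages` to a second `chat.completions.create` call would let the model summarize the forecast for the user.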

When to use it

Use function calling when you want your LLM to interact directly with APIs, databases, or backend functions based on user input. It is ideal for building chatbots that perform actions like booking, querying data, or controlling devices. Avoid it when you only need free-form text generation without structured outputs or when the function schema is too complex for the model to reliably generate.
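Because model-generated arguments are not guaranteed to match the schema, it is good practice to validate them before execution. A minimal hand-rolled check is sketched below, reusing the get_weather parameter schema; a production system might use a full JSON Schema validator instead:

```python
import json

# Parameter schema for the hypothetical get_weather function.
schema = {
    "type": "object",
    "properties": {
        "location": {"type": "string"},
        "date": {"type": "string"},
    },
    "required": ["location", "date"],
}

def validate_args(raw_arguments: str, schema: dict) -> dict:
    """Parse model-generated JSON arguments and check required string fields."""
    args = json.loads(raw_arguments)
    for field in schema["required"]:
        if field not in args:
            raise ValueError(f"missing required argument: {field}")
    for field, spec in schema["properties"].items():
        if field in args and spec["type"] == "string" and not isinstance(args[field], str):
            raise ValueError(f"argument {field} must be a string")
    return args

print(validate_args('{"location": "New York", "date": "2026-04-27"}', schema))
# prints: {'location': 'New York', 'date': '2026-04-27'}
```

Rejecting malformed calls early keeps a single bad model output from propagating into the backend.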

Key terms

Term | Definition
Function calling | LLM feature to output structured function call data.
Function schema | JSON schema defining a function's name, parameters, and types.
Function call object | The structured output specifying which function to call and with what arguments.
API integration | Connecting LLM outputs to external APIs or services.

Key Takeaways

  • Function calling lets LLMs produce structured commands for direct API or code interaction.
  • Define clear function schemas to guide the model's output format and parameters.
  • Use function calling to automate tasks and integrate LLMs with backend systems.
  • Avoid function calling when only natural language text output is needed.
  • Current models like gpt-4o support robust function calling capabilities.
Verified 2026-04 · gpt-4o