How-to · Intermediate · 3 min read

How to use nested models with Instructor

Quick answer
Use Instructor with nested Pydantic models by defining inner models as fields of outer models. Pass the outermost model as response_model to client.chat.completions.create(), and Instructor returns a fully validated instance, nested objects included.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install "openai>=1.0" instructor pydantic

Setup

Install the required packages and set your OpenAI API key as an environment variable.

bash
pip install "openai>=1.0" instructor pydantic
export OPENAI_API_KEY="your-key-here"

Step by step

Define nested Pydantic models and use Instructor to parse AI chat completions into these nested models.

python
import os
from pydantic import BaseModel
from typing import List
from openai import OpenAI
import instructor

# Initialize OpenAI client
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Create Instructor client wrapping OpenAI
inst_client = instructor.from_openai(client)

# Define nested Pydantic models
class Address(BaseModel):
    street: str
    city: str
    zip_code: str

class User(BaseModel):
    name: str
    age: int
    address: Address
    hobbies: List[str]

# Prompt to extract nested data
prompt = "Extract user info: John, 30 years old, lives at 123 Main St, Springfield, 12345, hobbies: reading, hiking"

# Call chat completion with response_model
response = inst_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
    response_model=User
)

# The response is already a validated User instance
user = response
print(f"Name: {user.name}")
print(f"Age: {user.age}")
print(f"Street: {user.address.street}")
print(f"City: {user.address.city}")
print(f"Zip: {user.address.zip_code}")
print(f"Hobbies: {', '.join(user.hobbies)}")
output
Name: John
Age: 30
Street: 123 Main St
City: Springfield
Zip: 12345
Hobbies: reading, hiking
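Because the result is an ordinary Pydantic model, you can also serialize it back into a nested dict or JSON string. A standalone sketch reusing the models and values from the example above (no API call needed):

```python
from typing import List
from pydantic import BaseModel

class Address(BaseModel):
    street: str
    city: str
    zip_code: str

class User(BaseModel):
    name: str
    age: int
    address: Address
    hobbies: List[str]

user = User(
    name="John",
    age=30,
    address=Address(street="123 Main St", city="Springfield", zip_code="12345"),
    hobbies=["reading", "hiking"],
)

# model_dump() produces a plain nested dict; model_dump_json() a JSON string
data = user.model_dump()
print(data["address"]["city"])  # → Springfield
```

This is handy when you need to hand the extracted data to code that expects plain dicts or JSON rather than Pydantic objects.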

Common variations

For async calls, wrap AsyncOpenAI (rather than OpenAI) with instructor.from_openai and await chat.completions.create(). You can switch models by changing the model parameter, and nested models can include optional fields and lists for flexible extraction.

python
import asyncio
from openai import AsyncOpenAI

# Async calls require an AsyncOpenAI client wrapped by Instructor
async_client = instructor.from_openai(AsyncOpenAI())

async def async_example():
    user = await async_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        response_model=User,
    )
    print(user)

asyncio.run(async_example())
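As a sketch of the optional-fields variation: give fields Optional types and defaults, and validation tolerates missing values instead of failing. The FlexibleUser name below is a hypothetical variant of the User model above:

```python
from typing import List, Optional
from pydantic import BaseModel

class Address(BaseModel):
    street: str
    city: str
    zip_code: Optional[str] = None  # zip may be missing from the source text

class FlexibleUser(BaseModel):
    name: str
    age: Optional[int] = None       # stays None if the text gives no age
    address: Address
    hobbies: List[str] = []         # empty list when no hobbies are mentioned

# Fields with defaults simply keep those defaults when the data is absent
u = FlexibleUser(name="Ana", address=Address(street="9 Oak Ave", city="Dayton"))
print(u.age, u.address.zip_code, u.hobbies)  # → None None []
```

Pass FlexibleUser as response_model exactly as before; the defaults apply whenever the AI omits a field.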

Troubleshooting

  • If the AI response does not parse correctly, ensure your Pydantic models match the expected data structure.
  • Use clear, explicit prompts to guide the AI to output JSON matching your nested models.
  • Check your API key and environment variables if you get authentication errors.
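One quick way to check that your models describe the structure you expect is to print the JSON schema Pydantic generates, since that schema is what guides the model's output. A sketch reusing the User and Address models from above:

```python
from typing import List
from pydantic import BaseModel

class Address(BaseModel):
    street: str
    city: str
    zip_code: str

class User(BaseModel):
    name: str
    age: int
    address: Address
    hobbies: List[str]

schema = User.model_json_schema()
# Nested models appear under $defs and are referenced by the outer fields
print(list(schema["$defs"]))  # → ['Address']
print(schema["properties"]["address"])
```

If a field you expect is missing from the printed schema, the AI was never asked for it, which usually explains a parse that "loses" data.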

Key Takeaways

  • Define nested Pydantic models to represent complex structured data for Instructor.
  • Pass the outermost model as response_model to parse nested AI responses automatically.
  • Use clear prompts to help the AI generate output matching your nested model schema.
  • You can use both synchronous and asynchronous calls with Instructor.
  • Always verify environment variables and model names to avoid runtime errors.
Verified 2026-04 · gpt-4o-mini