# How to build a file reading agent with OpenAI
## Quick answer
Use the OpenAI Python SDK to build a file reading agent: load the file's content, send it as a prompt to a model such as gpt-4o, and process the response. This lets you summarize or query file contents dynamically with `client.chat.completions.create`.

## Prerequisites
- Python 3.8+
- OpenAI API key (free tier works)
- `pip install "openai>=1.0"`
## Setup
Install the OpenAI Python SDK and set your API key as an environment variable for secure access.
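For example, in a POSIX shell (the key value shown is a placeholder):

```shell
# Set the key for the current shell session (placeholder value)
export OPENAI_API_KEY="your-api-key-here"
```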
```shell
pip install "openai>=1.0"
```

## Step by step
This example reads a text file, sends its content to the gpt-4o model, and prints a summary generated by the agent.
```python
import os
from openai import OpenAI

# Initialize the OpenAI client
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Read the file's content
file_path = "example.txt"
with open(file_path, "r", encoding="utf-8") as f:
    file_content = f.read()

# Prepare a prompt asking the model to summarize the file
prompt = f"Summarize the following text:\n\n{file_content}"

# Call the chat completions API
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

# Output the summary
print("Summary of file content:")
print(response.choices[0].message.content)
```

## Output

```
Summary of file content:
[Model-generated summary of the file's text]
```
## Common variations
- Use async calls with `asyncio` for non-blocking file reading and API requests.
- Switch the model to gpt-4o-mini for faster, cheaper responses.
- Implement streaming responses to process large file summaries incrementally.
```python
import os
import asyncio
import aiofiles  # requires: pip install aiofiles
from openai import AsyncOpenAI

async def async_file_reading_agent():
    # AsyncOpenAI (not OpenAI) provides awaitable methods in openai>=1.0
    client = AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])
    file_path = "example.txt"
    async with aiofiles.open(file_path, mode="r", encoding="utf-8") as f:
        file_content = await f.read()
    prompt = f"Summarize the following text:\n\n{file_content}"
    response = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    print("Async summary:")
    print(response.choices[0].message.content)

# asyncio.run(async_file_reading_agent())
```

## Output

```
Async summary:
[Model-generated summary]
```
## Troubleshooting
- If you get a FileNotFoundError, verify that the file path is correct and accessible.
- If the API returns an error, check your OPENAI_API_KEY environment variable and the model name.
- For large files, consider chunking the content before sending it to avoid token limits.
## Key Takeaways
- Use the OpenAI Python SDK with environment variables for secure API access.
- Read file content and send it as a prompt to gpt-4o for summarization or querying.
- Adapt the agent with async calls or smaller models for efficiency and cost control.