Azure OpenAI with Azure API Management
Quick answer
Use the AzureOpenAI client from the openai Python package, configured with your Azure API Management endpoint and deployment name. Set azure_endpoint and api_key from environment variables, then call chat.completions.create with your model deployment to access Azure OpenAI securely through API Management.

Prerequisites

- Python 3.8+
- An Azure OpenAI resource with a deployed model
- An Azure API Management instance configured as an OpenAI proxy
- pip install "openai>=1.0"
- Environment variables for the Azure API key and endpoint
Setup
Install the official openai Python package and set environment variables for your Azure API Management endpoint and API key. Ensure your Azure OpenAI model is deployed and accessible via API Management.
pip install "openai>=1.0"

output

Collecting openai
  Downloading openai-1.x.x-py3-none-any.whl (xx kB)
Installing collected packages: openai
Successfully installed openai-1.x.x
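Before running the examples, you can sanity-check that the required environment variables are present. This is a minimal stdlib sketch; the variable names match those used in the examples below, so adjust them if your setup differs:

```python
import os

REQUIRED_VARS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_DEPLOYMENT",
]

def missing_env_vars(names):
    """Return the subset of names not set (or empty) in the environment."""
    return [n for n in names if not os.environ.get(n)]

missing = missing_env_vars(REQUIRED_VARS)
if missing:
    print("Missing environment variables:", ", ".join(missing))
else:
    print("All required environment variables are set.")
```

Failing fast here gives a clearer error than a KeyError deep inside the client setup.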
Step by step
This example demonstrates calling Azure OpenAI through Azure API Management using the AzureOpenAI client. Replace environment variables with your actual values.
import os
from openai import AzureOpenAI
client = AzureOpenAI(
api_key=os.environ["AZURE_OPENAI_API_KEY"],
azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
api_version="2024-02-01"
)
response = client.chat.completions.create(
model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
messages=[{"role": "user", "content": "Hello from Azure API Management!"}]
)
print("Response:", response.choices[0].message.content)

output
Response: Hello from Azure API Management! How can I assist you today?
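Under the hood, the client issues a POST to the deployments route on your endpoint, which is useful to know when debugging what API Management actually receives. The sketch below builds that URL following Azure OpenAI's REST route convention; the endpoint and deployment values are illustrative placeholders:

```python
def chat_completions_url(endpoint, deployment, api_version):
    """Build the Azure OpenAI chat-completions URL for a given deployment."""
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

# Placeholder values for illustration only.
url = chat_completions_url(
    "https://my-apim.azure-api.net", "my-deployment", "2024-02-01"
)
print(url)
```

If a request fails, comparing this URL against your API Management route configuration is a quick first check.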
Common variations
- Use async calls with AsyncAzureOpenAI and await client.chat.completions.create(...).
- Stream responses by adding stream=True to chat.completions.create.
- Switch models by changing the model parameter to another deployment name.
import asyncio
import os

from openai import AsyncAzureOpenAI

async def main():
    # Use the async client for await/async-for support.
    client = AsyncAzureOpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_version="2024-02-01"
    )
    stream = await client.chat.completions.create(
        model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
        messages=[{"role": "user", "content": "Stream this response."}],
        stream=True
    )
    async for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)

asyncio.run(main())

output
Streamed response text appears here in real time...
Troubleshooting
- If you get 401 Unauthorized, verify your AZURE_OPENAI_API_KEY and that API Management is correctly configured.
- 404 Not Found means the deployment name or endpoint URL is incorrect.
- Timeouts may require checking network connectivity or increasing client timeout settings.
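For transient timeouts, the openai client constructor also accepts timeout and max_retries arguments; alternatively, you can wrap the call in your own backoff helper. Below is a generic stdlib sketch of exponential backoff, not a definitive implementation; the commented usage line assumes a client and call like those in the examples above:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.5):
    """Call fn(), retrying on any exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Usage (illustrative): wrap the chat call from the example above.
# result = with_retries(lambda: client.chat.completions.create(...))
```

In production you would typically catch only retryable errors (timeouts, 429s) rather than every exception.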
Key Takeaways
- Use the AzureOpenAI client with azure_endpoint and api_key for API Management integration.
- Set the environment variables AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, and AZURE_OPENAI_DEPLOYMENT for secure access.
- Support async and streaming calls with AsyncAzureOpenAI and the stream=True parameter.
- Common errors relate to authentication and deployment naming; verify these first.
- Azure API Management enables centralized control and security for Azure OpenAI usage.