How-to · Beginner · 3 min read

How to set up Semantic Kernel with Azure OpenAI

Quick answer
Use the semantic_kernel Python package to create a Kernel instance and register an AzureOpenAIChatCompletion service configured with your Azure endpoint, API key, API version, and deployment name. The kernel can then route chat-completion requests to your Azure OpenAI model.

Prerequisites

  • Python 3.10+ (required by current semantic-kernel releases)
  • Azure OpenAI resource with deployment name
  • pip install semantic-kernel "openai>=1.0"
  • Set environment variables AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_DEPLOYMENT
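Before running any Semantic Kernel code, it can help to fail fast when a required variable is unset. A minimal sketch using only the standard library (the missing_azure_settings helper name is my own, not part of Semantic Kernel):

```python
import os

REQUIRED_VARS = (
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_DEPLOYMENT",
)

def missing_azure_settings(env=None):
    """Return names of required Azure OpenAI variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Report anything that still needs to be exported.
missing = missing_azure_settings()
if missing:
    print("Still missing:", ", ".join(missing))
else:
    print("All Azure OpenAI environment variables are set.")
```

Running this before the setup code below gives a clearer error than an authentication failure deep inside an API call.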

Setup

Install the required Python packages and set environment variables for your Azure OpenAI resource.

  • Install Semantic Kernel and OpenAI SDK:
bash
pip install semantic-kernel "openai>=1.0"
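To confirm the install succeeded without touching Azure, you can query the installed package versions with the standard library (a small sketch; it returns None when a package is absent):

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version string for a package, or None if not installed."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

print("semantic-kernel:", installed_version("semantic-kernel"))
print("openai:", installed_version("openai"))
```

If either line prints None, re-run the pip install in the environment your script will use.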

Step by step

Use the following Python code to initialize Semantic Kernel with AzureOpenAIChatCompletion. The connection details are read from the environment variables set above.

python
import asyncio
import os

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import AzureOpenAIChatCompletion

# Ensure environment variables are set:
# AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_DEPLOYMENT

kernel = sk.Kernel()

kernel.add_service(AzureOpenAIChatCompletion(
    service_id="chat",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    deployment_name=os.environ["AZURE_OPENAI_DEPLOYMENT"],
    api_version="2024-02-01",
))

# Semantic Kernel's Python chat APIs are async, so wrap the call in asyncio.run().
async def main():
    response = await kernel.invoke_prompt(
        "Hello from Semantic Kernel with Azure OpenAI!"
    )
    print("Response:", response)

asyncio.run(main())
output
Response: Hello! How can I assist you today?

Common variations

You can customize the api_version parameter or point at different Azure OpenAI deployments for other models. For finer control over the conversation, retrieve the registered chat service from the kernel and drive it directly with a ChatHistory, which makes multi-turn exchanges explicit. You can also register multiple AI services on one kernel, under distinct service_id values, for hybrid scenarios.

python
import asyncio
import os

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import (
    AzureOpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory

async def async_chat():
    kernel = sk.Kernel()
    kernel.add_service(AzureOpenAIChatCompletion(
        service_id="chat",
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        deployment_name=os.environ["AZURE_OPENAI_DEPLOYMENT"],
        api_version="2024-02-01",
    ))

    # Fetch the registered service and call it directly with a chat history.
    chat_service = kernel.get_service("chat")
    history = ChatHistory()
    history.add_user_message("Async hello!")

    response = await chat_service.get_chat_message_content(
        chat_history=history,
        settings=OpenAIChatPromptExecutionSettings(),
    )
    print("Async response:", response.content)

asyncio.run(async_chat())
output
Async response: Hello! How can I help you asynchronously today?

Troubleshooting

  • If you get authentication errors, verify your AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT environment variables are correct.
  • Ensure your deployment name matches exactly the one configured in Azure.
  • Check your network connectivity to the Azure endpoint.
  • Use api_version="2024-02-01" or the latest supported version.
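A common source of authentication and 404 errors is a malformed endpoint value: a missing scheme, a stray path segment, or a typo in the host. A small sanity check, assuming the standard https://<resource>.openai.azure.com endpoint shape (the endpoint_problems helper name is illustrative, not part of any SDK):

```python
from urllib.parse import urlparse

def endpoint_problems(endpoint):
    """Return a list of likely problems with an Azure OpenAI endpoint URL."""
    problems = []
    parsed = urlparse(endpoint.strip())
    if parsed.scheme != "https":
        problems.append("endpoint should start with https://")
    if not parsed.netloc.endswith(".openai.azure.com"):
        problems.append("host should normally end with .openai.azure.com")
    if parsed.path not in ("", "/"):
        problems.append("endpoint should be the resource root, without extra path segments")
    return problems

print(endpoint_problems("https://my-resource.openai.azure.com"))    # prints []
print(endpoint_problems("http://my-resource.openai.azure.com/v1"))  # flags scheme and path
```

Running this against AZURE_OPENAI_ENDPOINT before debugging credentials can save a round of trial and error.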

Key takeaways

  • Use AzureOpenAIChatCompletion to integrate Semantic Kernel with Azure OpenAI easily.
  • Set environment variables for API key, endpoint, and deployment to avoid hardcoding secrets.
  • Semantic Kernel supports both synchronous and asynchronous chat completions with Azure OpenAI.
  • Always verify your Azure deployment name and API version to prevent errors.
  • You can add multiple AI services to Semantic Kernel for flexible AI workflows.
Verified 2026-04 · gpt-4o, AzureOpenAIChatCompletion