How-to · Beginner · 3 min read

Semantic Kernel process framework explained

Quick answer
The Semantic Kernel process framework lets developers orchestrate AI workflows by defining modular, reusable skills (called plugins in recent releases) as functions that can be composed into processes. A central Kernel object manages AI services and invokes these functions with input arguments, enabling structured, maintainable AI application development.
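Stripped of the library API, the composition idea can be shown in a library-free sketch. The skill functions and the run_process helper below are illustrative, not part of Semantic Kernel:

```python
# Library-free sketch of the idea: skills are small reusable functions,
# and a process is just an ordered composition of them.
def clean(text: str) -> str:
    # Collapse repeated whitespace.
    return " ".join(text.split())

def shorten(text: str, limit: int = 40) -> str:
    # Truncate long text, marking the cut with an ellipsis.
    return text if len(text) <= limit else text[:limit].rstrip() + "..."

def run_process(skills, text: str) -> str:
    # Feed each skill's output into the next, like a tiny pipeline.
    for skill in skills:
        text = skill(text)
    return text

result = run_process([clean, shorten],
                     "  Semantic   Kernel composes  skills into processes.  ")
print(result)
```

Semantic Kernel adds service management, prompt templating, and async model calls on top of this pattern, but the mental model is the same: skills in, process out.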

PREREQUISITES

  • Python 3.10+ (recent semantic-kernel releases require it)
  • OpenAI API key (free tier works)
  • pip install semantic-kernel openai

Setup

Install the semantic-kernel Python package and set your OpenAI API key as an environment variable.

  • Run pip install semantic-kernel openai
  • Set environment variable OPENAI_API_KEY with your OpenAI API key
bash
pip install semantic-kernel openai

Step by step

This example shows how to create a Kernel, add an OpenAI chat completion service, define a simple skill as a prompt function, and run a process that invokes it with an input. Function invocation in recent semantic-kernel releases is asynchronous, so the call is awaited.

python
import asyncio
import os

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import KernelArguments

# Initialize the Kernel
kernel = Kernel()

# Add an OpenAI chat completion service
kernel.add_service(OpenAIChatCompletion(
    service_id="chat",
    api_key=os.environ["OPENAI_API_KEY"],
    ai_model_id="gpt-4o-mini",
))

# Define a simple skill as a prompt function
summarize = kernel.add_function(
    plugin_name="TextPlugin",
    function_name="summarize",
    prompt="Summarize this text concisely:\n{{$input}}",
)

# Run the process: invocation is asynchronous, so wrap it in asyncio.run
async def main():
    input_text = "Semantic Kernel is a framework to orchestrate AI skills and workflows."
    result = await kernel.invoke(summarize, KernelArguments(input=input_text))
    print("Summary:", result)

asyncio.run(main())
output
Summary: Semantic Kernel orchestrates AI skills and workflows efficiently.

Common variations

Invocation is asynchronous, so use await inside async functions. You can also run one-off prompts with kernel.invoke_prompt without registering a function first, switch AI models by changing ai_model_id, or define multiple skills and chain them into larger processes. Semantic Kernel supports chaining skills and integrating external data sources through plugins.

python
import asyncio
import os

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import KernelArguments

async def main():
    kernel = Kernel()
    kernel.add_service(OpenAIChatCompletion(
        service_id="chat",
        api_key=os.environ["OPENAI_API_KEY"],
        ai_model_id="gpt-4o",  # swap models by changing ai_model_id
    ))

    # invoke_prompt runs a one-off prompt without registering a function first
    result = await kernel.invoke_prompt(
        "Summarize this text:\n{{$input}}",
        arguments=KernelArguments(input="Semantic Kernel enables modular AI workflows."),
    )
    print("Async summary:", result)

asyncio.run(main())
output
Async summary: Semantic Kernel enables modular AI workflows.
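Chaining skills amounts to awaiting each one and feeding its output into the next. A library-free sketch of the pattern, where the two stub skills stand in for real model calls:

```python
import asyncio

# Library-free sketch: chaining async skills means awaiting each one
# and passing its output to the next. The stubs stand in for model calls.
async def summarize_skill(text: str) -> str:
    # Keep only the first sentence.
    return text.split(".")[0] + "."

async def translate_skill(text: str) -> str:
    # Pretend "translation": uppercase the text.
    return text.upper()

async def run_chain(text: str) -> str:
    summary = await summarize_skill(text)
    return await translate_skill(summary)

result = asyncio.run(run_chain("Skills compose. Extra detail is dropped."))
print(result)
```

With real Semantic Kernel functions, the structure is identical: await one kernel invocation, then pass its result as the input argument of the next.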

Troubleshooting

  • If you see API key missing errors, ensure OPENAI_API_KEY is set in your environment.
  • If the model is not found, verify the ai_model_id matches a valid OpenAI model like gpt-4o-mini.
  • For timeout or network errors, check your internet connection and API quota.
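The first bullet can be caught early with a small guard before the kernel is created. A sketch; the helper name and error message are illustrative, not part of Semantic Kernel:

```python
import os

def require_api_key() -> str:
    # Fail fast with a clear message instead of a deep stack trace later.
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set; export it before running.")
    return key
```

Call it once at startup and pass the returned key to OpenAIChatCompletion.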

Key Takeaways

  • Use Kernel to manage AI services and orchestrate skills.
  • Define reusable AI functions as skills for modular workflows.
  • Processes run by invoking skills with input arguments for structured AI tasks.
Verified 2026-04 · gpt-4o-mini, gpt-4o