How-to · Beginner · 3 min read

How to create plugins in Semantic Kernel

Quick answer
Create plugins in Semantic Kernel by writing a Python class whose methods are decorated with @kernel_function, then registering an instance with kernel.add_plugin. Each decorated method becomes a function you can invoke through the kernel.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install semantic-kernel openai

Setup

Install the semantic-kernel Python package and export your OpenAI API key as an environment variable:

```bash
pip install semantic-kernel openai
export OPENAI_API_KEY="sk-..."  # replace with your own key
```
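Before calling the kernel, it can help to fail fast when the key is missing. A minimal sketch of such a check (the helper name check_api_key is ours, not part of Semantic Kernel):

```python
import os

def check_api_key() -> bool:
    """Return True if OPENAI_API_KEY is set to a non-empty value."""
    return bool(os.environ.get("OPENAI_API_KEY"))

print("OPENAI_API_KEY set:", check_api_key())
```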

Step by step

Define a plugin as a Python class whose methods are decorated with @kernel_function, register an instance with kernel.add_plugin, and invoke a function through the kernel's (async) invoke method.

```python
import asyncio
import os

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import kernel_function

# Initialize the kernel and register an OpenAI chat service
kernel = Kernel()
kernel.add_service(OpenAIChatCompletion(
    service_id="chat",
    api_key=os.environ["OPENAI_API_KEY"],
    ai_model_id="gpt-4o-mini",
))

# Define a plugin class; @kernel_function marks the methods the kernel may call
class GreetingPlugin:
    @kernel_function(name="hello", description="Greets a user by name")
    def hello(self, name: str) -> str:
        return f"Hello, {name}! Welcome to Semantic Kernel plugins."

# Register the plugin with the kernel
kernel.add_plugin(GreetingPlugin(), plugin_name="GreetingPlugin")

# kernel.invoke is a coroutine, so run it inside an event loop
async def main():
    result = await kernel.invoke(
        plugin_name="GreetingPlugin",
        function_name="hello",
        name="Alice",
    )
    print(result)

asyncio.run(main())
```

Output:

```
Hello, Alice! Welcome to Semantic Kernel plugins.
```
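Because a plugin is plain Python, its methods can be unit-tested without constructing a kernel or calling any model. A standalone sketch (the class mirrors GreetingPlugin above, minus the decorator so it runs without semantic-kernel installed):

```python
class GreetingPlugin:
    def hello(self, name: str) -> str:
        return f"Hello, {name}! Welcome to Semantic Kernel plugins."

# Call the method directly -- no kernel, no network
plugin = GreetingPlugin()
print(plugin.hello("Alice"))  # -> Hello, Alice! Welcome to Semantic Kernel plugins.
```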

Common variations

Plugin methods can be async; kernel.invoke awaits them the same way as synchronous ones. You can also switch AI models by changing ai_model_id in OpenAIChatCompletion, and organize plugins across multiple classes or modules for modularity.

```python
import asyncio

from semantic_kernel.functions import kernel_function

class AsyncGreetingPlugin:
    @kernel_function(name="hello_async", description="Greets a user asynchronously")
    async def hello_async(self, name: str) -> str:
        return f"Hello asynchronously, {name}!"

kernel.add_plugin(AsyncGreetingPlugin(), plugin_name="AsyncGreetingPlugin")

async def main():
    # kernel.invoke handles async plugin methods transparently
    result = await kernel.invoke(
        plugin_name="AsyncGreetingPlugin",
        function_name="hello_async",
        name="Bob",
    )
    print(result)

asyncio.run(main())
```

Output:

```
Hello asynchronously, Bob!
```
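For larger projects, each plugin can live in its own class or module and be registered in a loop. A sketch of that layout (InventoryPlugin and PricingPlugin are illustrative names, not Semantic Kernel built-ins; the @kernel_function decorators and kernel registration are shown as comments so the sketch runs standalone):

```python
class InventoryPlugin:
    def in_stock(self, item: str) -> bool:
        return item in {"widget", "gadget"}

class PricingPlugin:
    def price(self, item: str) -> float:
        return {"widget": 9.99, "gadget": 19.99}.get(item, 0.0)

# With a kernel from the Setup step (and @kernel_function on each method),
# registration becomes one loop:
# for name, plugin in {"Inventory": InventoryPlugin(),
#                      "Pricing": PricingPlugin()}.items():
#     kernel.add_plugin(plugin, plugin_name=name)

print(InventoryPlugin().in_stock("widget"), PricingPlugin().price("gadget"))
```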

Troubleshooting

  • If the kernel reports that a function or plugin was not found, verify that the names match exactly and that the method carries the @kernel_function decorator.
  • If the AI model call fails, check your OPENAI_API_KEY environment variable and network connectivity.
  • kernel.invoke is a coroutine: call it with await inside an async function (e.g. via asyncio.run).
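The first bullet comes down to exact string matching on plugin and function names. A standalone sketch of that lookup rule (the registry dict stands in for the kernel's plugin collection, and diagnose_lookup is our helper, not a Semantic Kernel API):

```python
def diagnose_lookup(plugins: dict[str, set[str]], plugin: str, function: str) -> str:
    """Report why a plugin.function lookup would fail, or 'ok' if it succeeds."""
    if plugin not in plugins:
        return f"plugin {plugin!r} not registered; registered: {sorted(plugins)}"
    if function not in plugins[plugin]:
        return f"function {function!r} not found in {plugin!r}"
    return "ok"

registry = {"GreetingPlugin": {"hello"}}
print(diagnose_lookup(registry, "GreetingPlugin", "helo"))  # typo caught by exact match
```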

Key Takeaways

  • Define plugins as Python classes whose methods are decorated with @kernel_function.
  • Register plugin instances with kernel.add_plugin to expose them.
  • Invoke plugin functions with await kernel.invoke(plugin_name=..., function_name=...).
  • Use OpenAIChatCompletion to connect an AI model to your kernel.
  • Exact plugin/function names and a valid OPENAI_API_KEY avoid the most common errors.
Verified 2026-04 · gpt-4o-mini