How to use Composio with CrewAI
Quick answer
Use the composio_openai package to obtain Composio tools, then pass them to an OpenAI client from the openai SDK. Send chat completions with the tools= parameter to enable CrewAI tool calls within your AI chat flow.
Prerequisites
- Python 3.8+
- OpenAI API key (free tier works)
- Composio API key
- pip install "openai>=1.0" composio-core composio-openai
Setup
Install the required Python packages and set environment variables for your API keys.
- Install the packages: openai and composio-core with composio-openai.
- Set OPENAI_API_KEY and COMPOSIO_API_KEY in your environment.
pip install "openai>=1.0" composio-core composio-openai
# Set environment variables in your shell (replace the placeholders with your real keys)
export OPENAI_API_KEY="your-openai-api-key"
export COMPOSIO_API_KEY="your-composio-api-key"

output

Collecting openai
Collecting composio-core
Collecting composio-openai
Successfully installed openai composio-core composio-openai-...
# No output for the export commands
Step by step
This example shows how to initialize the Composio toolset, retrieve tools for CrewAI actions, and use them with the OpenAI client to perform a tool call.
import os
from openai import OpenAI
from composio_openai import ComposioToolSet, Action
# Initialize Composio toolset with your Composio API key
toolset = ComposioToolSet(api_key=os.environ["COMPOSIO_API_KEY"])
# Get tools for CrewAI actions (e.g., GitHub star repository)
tools = toolset.get_tools(actions=[Action.GITHUB_STAR_A_REPOSITORY])
# Initialize OpenAI client with your OpenAI API key
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
# Compose chat messages
messages = [{"role": "user", "content": "Star the openai/openai-python repo"}]
# Send chat completion request with tools parameter
tool_response = client.chat.completions.create(
model="gpt-4o-mini",
tools=tools,
messages=messages
)
# Handle tool calls if any
if tool_response.choices[0].finish_reason == "tool_calls":
    handled = toolset.handle_tool_calls(tool_response)
    print("Tool call handled result:", handled)
else:
    print("Response:", tool_response.choices[0].message.content)

output

Tool call handled result: {'status': 'success', 'details': 'Repository starred successfully'}

Common variations
You can make async calls with asyncio and the OpenAI async client (AsyncOpenAI). You can also switch models such as gpt-4o or gpt-4o-mini depending on your latency and cost needs. For LangChain integration, use the ComposioToolSet from the composio_langchain package with LangChain chains.
import asyncio
import os
from openai import AsyncOpenAI
from composio_openai import ComposioToolSet, Action

async def async_main():
    toolset = ComposioToolSet(api_key=os.environ["COMPOSIO_API_KEY"])
    tools = toolset.get_tools(actions=[Action.GITHUB_STAR_A_REPOSITORY])
    # Use the async client; in openai>=1.0 you await chat.completions.create
    # (the older acreate method no longer exists)
    client = AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])
    messages = [{"role": "user", "content": "Star the openai/openai-python repo"}]
    response = await client.chat.completions.create(
        model="gpt-4o",
        tools=tools,
        messages=messages
    )
    if response.choices[0].finish_reason == "tool_calls":
        result = toolset.handle_tool_calls(response)
        print("Async tool call handled result:", result)
    else:
        print("Async response:", response.choices[0].message.content)

asyncio.run(async_main())

output

Async tool call handled result: {'status': 'success', 'details': 'Repository starred successfully'}

Troubleshooting
- If you get authentication errors, verify your OPENAIAPIKEY and COMPOSIO_API_KEY environment variables are set correctly.
- If no tools are returned, check that your Composio API key has access to CrewAI tool actions.
- For unexpected errors, enable verbose logging or check network connectivity.
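The first two checks can be scripted with nothing but the standard library. This is a small sketch; the missing_keys helper is our own, not part of any SDK, and the two variable names are just the pair this guide uses.

```python
import os

REQUIRED = ("OPENAI_API_KEY", "COMPOSIO_API_KEY")

def missing_keys(env=os.environ, required=REQUIRED):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not env.get(name)]

if __name__ == "__main__":
    missing = missing_keys()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
    else:
        print("All required API keys are set.")
```

Run it before the main script to fail fast with a readable message instead of a KeyError from os.environ.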
Key Takeaways
- Use ComposioToolSet to fetch CrewAI tools and pass them via tools= to client.chat.completions.create.
- Always set OPENAI_API_KEY and COMPOSIO_API_KEY in your environment for authentication.
- Handle tool calls by checking finish_reason == "tool_calls" and use toolset.handle_tool_calls() to execute them.
- Async usage is supported via the AsyncOpenAI client for scalable integration.
- Switch between models like gpt-4o-mini and gpt-4o based on your performance and cost requirements.
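The finish_reason branch from the examples above can be exercised without any API keys by substituting stub objects. This is a self-contained sketch: SimpleNamespace stands in for the real response objects, and StubToolset is a hypothetical stand-in for ComposioToolSet.

```python
from types import SimpleNamespace

def dispatch(response, toolset):
    """Return ('tool', result) when the model requested tool calls,
    otherwise ('text', content) -- mirrors the branch in the examples above."""
    choice = response.choices[0]
    if choice.finish_reason == "tool_calls":
        return "tool", toolset.handle_tool_calls(response)
    return "text", choice.message.content

class StubToolset:
    # Stand-in for ComposioToolSet: pretends the tool call succeeded.
    def handle_tool_calls(self, response):
        return {"status": "success"}

# Stub responses shaped like the OpenAI chat completion objects.
text_resp = SimpleNamespace(choices=[SimpleNamespace(
    finish_reason="stop",
    message=SimpleNamespace(content="Done!"))])
tool_resp = SimpleNamespace(choices=[SimpleNamespace(
    finish_reason="tool_calls",
    message=SimpleNamespace(content=None))])

print(dispatch(text_resp, StubToolset()))  # ('text', 'Done!')
print(dispatch(tool_resp, StubToolset()))  # ('tool', {'status': 'success'})
```

Keeping the dispatch in a small function like this makes the branch easy to unit-test before wiring it to live API calls.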