Intermediate · 3 min read

How to bind tools to a LangChain model

Quick answer
To bind tools to a LangChain chat model, call its .bind_tools() method with a list of tool definitions, e.g. ChatOpenAI(...).bind_tools(tools). If you are using the OpenAI client directly, pass the same definitions via the tools= parameter of client.chat.completions.create() to enable function calling and tool execution.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install openai>=1.0
  • pip install composio-core composio-openai
  • pip install langchain_openai

Setup

Install the required packages and set your OpenAI and Composio API keys as environment variables.

  • Install OpenAI SDK and Composio for tool integration:
bash
pip install openai composio-core composio-openai langchain_openai
output
Collecting openai
Collecting composio-core
Collecting composio-openai
Collecting langchain_openai
Successfully installed openai composio-core composio-openai langchain_openai
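Both examples below read the keys from the environment, so a quick preflight check avoids confusing authentication errors later. This is a minimal sketch; the variable names match the ones used throughout this guide:

```python
import os

def missing_keys(required=("OPENAI_API_KEY", "COMPOSIO_API_KEY")):
    """Return the names of required environment variables that are not set."""
    return [name for name in required if not os.environ.get(name)]

# Example: warn before making any API calls
for name in missing_keys():
    print(f"Warning: {name} is not set")
```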

Step by step

This example shows how to define a tool with Composio, bind it to the OpenAI client, and use it with a LangChain ChatOpenAI model for function calling.

python
import os
from openai import OpenAI
from composio_openai import ComposioToolSet, Action
from langchain_openai import ChatOpenAI

# Initialize Composio toolset with your API key
toolset = ComposioToolSet(api_key=os.environ["COMPOSIO_API_KEY"])

# Get tools you want to bind, e.g., GitHub star repository action
tools = toolset.get_tools(actions=[Action.GITHUB_STAR_A_REPOSITORY])

# Initialize OpenAI client
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Create a chat completion with tools bound
response = client.chat.completions.create(
    model="gpt-4o-mini",
    tools=tools,
    messages=[{"role": "user", "content": "Star the openai/openai-python repo"}]
)

# Handle tool calls if any
result = toolset.handle_tool_calls(response)

print("Model response:", response.choices[0].message.content)
print("Tool call result:", result)

# Using LangChain ChatOpenAI with tools
chat = ChatOpenAI(model="gpt-4o-mini", api_key=os.environ["OPENAI_API_KEY"])

# Bind the same OpenAI-format tool definitions to the LangChain model
chat_with_tools = chat.bind_tools(tools)

# invoke() returns an AIMessage; any tool calls appear in .tool_calls
lc_response = chat_with_tools.invoke("Star the openai/openai-python repo")
print("LangChain tool calls:", lc_response.tool_calls)
output
Model response: I have starred the openai/openai-python repository for you.
Tool call result: {'success': True, 'repository': 'openai/openai-python'}

Common variations

You can use different models like gpt-4o or gpt-4o-mini depending on your needs. For async usage with openai>=1.0, create an AsyncOpenAI client and await client.chat.completions.create() (the older acreate() method was removed in v1). Streaming does work with tools, but tool-call arguments arrive incrementally and must be assembled from the streamed chunks.

python
import asyncio
from openai import AsyncOpenAI

# Async usage requires the AsyncOpenAI client (openai>=1.0)
async_client = AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])

async def async_tool_call():
    response = await async_client.chat.completions.create(
        model="gpt-4o-mini",
        tools=tools,
        messages=[{"role": "user", "content": "Star the openai/openai-python repo"}]
    )
    print("Async response:", response.choices[0].message.content)

asyncio.run(async_tool_call())
output
Async response: I have starred the openai/openai-python repository for you.
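When you do stream with tools enabled, the model sends tool-call arguments as JSON fragments spread across chunks. The sketch below shows how to assemble such fragments; the (index, name, fragment) tuples are a simplified stand-in for the delta.tool_calls entries the SDK actually yields, not the real response objects:

```python
import json

def assemble_tool_calls(chunks):
    """Concatenate streamed tool-call argument fragments into complete calls.

    `chunks` is an iterable of (index, name, arguments_fragment) tuples,
    mimicking the delta.tool_calls entries of a streamed chat completion.
    """
    calls = {}
    for index, name, fragment in chunks:
        entry = calls.setdefault(index, {"name": None, "arguments": ""})
        if name:
            entry["name"] = name
        if fragment:
            entry["arguments"] += fragment
    # Parse the accumulated argument strings once the stream is complete
    return {i: {"name": c["name"], "arguments": json.loads(c["arguments"])}
            for i, c in calls.items()}

# Simulated delta stream: the tool name arrives first, arguments in pieces
stream = [(0, "GITHUB_STAR_A_REPOSITORY", ""),
          (0, None, '{"owner": "openai", '),
          (0, None, '"repo": "openai-python"}')]
print(assemble_tool_calls(stream))
```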

Troubleshooting

  • If you see tools ignored or no tool calls, ensure your model supports function calling and you pass the tools= parameter correctly.
  • Check that your COMPOSIO_API_KEY and OPENAI_API_KEY environment variables are set.
  • Use the latest openai and composio-core packages to avoid compatibility issues.
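To confirm whether the model actually attempted a tool call, inspect message.tool_calls on the response. The helper below is a hypothetical sketch exercised with stand-in objects rather than live API responses; it works with anything exposing content and tool_calls attributes, such as response.choices[0].message from the OpenAI SDK:

```python
from types import SimpleNamespace

def describe_message(message):
    """Report whether a chat message made tool calls or returned plain text."""
    tool_calls = getattr(message, "tool_calls", None)
    if tool_calls:
        names = [call.function.name for call in tool_calls]
        return "tool calls: " + ", ".join(names)
    return "text only: " + repr(message.content)

# Simulated message objects standing in for real SDK responses
call = SimpleNamespace(function=SimpleNamespace(name="GITHUB_STAR_A_REPOSITORY"))
with_tools = SimpleNamespace(content=None, tool_calls=[call])
plain = SimpleNamespace(content="Hello!", tool_calls=None)
print(describe_message(with_tools))  # tool calls: GITHUB_STAR_A_REPOSITORY
print(describe_message(plain))       # text only: 'Hello!'
```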

Key Takeaways

  • Bind tools to a LangChain chat model with .bind_tools(), or pass the tools= parameter to client.chat.completions.create() when using the OpenAI client directly.
  • Composio provides a convenient way to define and handle tools for function calling with OpenAI and LangChain.
  • Ensure environment variables for API keys are set and use supported models like gpt-4o-mini.
  • Async calls use the AsyncOpenAI client with await client.chat.completions.create(); streaming with tools works, but tool-call arguments arrive as incremental deltas.
  • Troubleshoot by verifying API keys, package versions, and model compatibility.
Verified 2026-04 · gpt-4o-mini, gpt-4o