How-to · Intermediate · 4 min read

How to define tools for LangChain agent

Quick answer
To define tools for a LangChain agent, create Python functions or classes that implement specific tasks, then wrap them with Tool objects from langchain.agents. Pass these tools to the agent constructor to enable the agent to call them during execution.
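Conceptually, a Tool bundles three things: a name, a callable, and a description the agent's LLM reads when deciding which tool to invoke. A stdlib-only sketch of that idea (`ToyTool` is an illustrative stand-in, not LangChain's class):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ToyTool:
    """A toy model of what a LangChain Tool stores."""
    name: str
    func: Callable[[str], str]
    description: str

def greet(name: str) -> str:
    return f"Hello, {name}!"

tools = [ToyTool("Greet", greet, "Greets a person by their name.")]

# The agent's LLM sees each tool's name and description, picks one,
# and supplies the input string; invocation is just a function call.
chosen = next(t for t in tools if t.name == "Greet")
print(chosen.func("Alice"))  # Hello, Alice!
```

The real `Tool` class adds validation and sync/async plumbing, but the mental model is the same: the description is the agent's only guide for choosing the tool.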

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install langchain langchain-openai

Setup

Install the required packages and set your OpenAI API key as an environment variable.

  • Install LangChain and the OpenAI integration package (which provides ChatOpenAI): pip install langchain langchain-openai
  • Set the environment variable in your shell: export OPENAI_API_KEY='your_api_key'
bash
pip install langchain langchain-openai

Step by step

Define Python functions as tools, wrap them with Tool, then create an agent with these tools and run it.

python
import os
from langchain.agents import Tool, initialize_agent
from langchain_openai import ChatOpenAI

# Define a simple tool function

def greet(name: str) -> str:
    return f"Hello, {name}!"

# Wrap the function as a LangChain Tool

greet_tool = Tool(
    name="Greet",
    func=greet,
    description="Greets a person by their name."
)

# Initialize the language model
llm = ChatOpenAI(model="gpt-4o", temperature=0, openai_api_key=os.environ["OPENAI_API_KEY"])

# Initialize the agent with the tool
agent = initialize_agent(
    tools=[greet_tool],
    llm=llm,
    agent="zero-shot-react-description",
    verbose=True
)

# Run the agent with an input that triggers the tool
response = agent.run("Say hello to Alice.")
print(response)
output
Hello, Alice!

Common variations

You can define async tools by writing a coroutine and passing it via the coroutine argument of Tool, then running the agent with agent.arun. You can also swap in different models, such as gpt-4o-mini or Anthropic's claude-3-5-sonnet-20241022 via the langchain-anthropic package. Tools can also be callable classes that implement __call__.

python
import asyncio
import os

from langchain.agents import Tool, initialize_agent
from langchain_openai import ChatOpenAI

async def async_greet(name: str) -> str:
    await asyncio.sleep(0.1)  # simulate async work
    return f"Hello asynchronously, {name}!"

# Pass the coroutine via Tool's `coroutine` argument. With func=None,
# the tool supports only async execution (agent.arun).
async_greet_tool = Tool(
    name="AsyncGreet",
    func=None,
    coroutine=async_greet,
    description="Asynchronously greets a person by their name."
)

async def main():
    llm = ChatOpenAI(model="gpt-4o-mini", temperature=0, openai_api_key=os.environ["OPENAI_API_KEY"])
    agent = initialize_agent(
        tools=[async_greet_tool],
        llm=llm,
        agent="zero-shot-react-description",
        verbose=True
    )
    response = await agent.arun("Say hello to Bob asynchronously.")
    print(response)

asyncio.run(main())
output
Hello asynchronously, Bob!
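As noted above, a tool's callable can also be a class instance that implements __call__. A minimal stdlib-only sketch (the Greeter class is illustrative); because the instance is callable, it can be passed as func= to Tool exactly like a plain function:

```python
class Greeter:
    """A callable class whose instances can serve as tool functions."""

    def __init__(self, prefix: str):
        self.prefix = prefix  # state that a plain function couldn't carry

    def __call__(self, name: str) -> str:
        return f"{self.prefix}, {name}!"

greeter = Greeter("Howdy")
print(greeter("Carol"))  # Howdy, Carol!

# The instance is callable, so it wraps exactly like a function:
# Tool(name="Greet", func=greeter, description="Greets a person by their name.")
```

The advantage over a bare function is that the instance can carry configuration or state (here, the greeting prefix) set once at construction time.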

Troubleshooting

  • If the agent does not call your tool, ensure the tool's description clearly matches the prompt context.
  • For environment variable errors, verify OPENAI_API_KEY is set correctly.
  • If you get import errors, confirm you installed recent versions of both langchain and langchain-openai.
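For the environment-variable issue, a small preflight check fails fast with an actionable message before any agent code runs (require_env is a hypothetical helper, not part of LangChain):

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, or raise with a fix hint."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; run: export {name}='your_api_key'"
        )
    return value

# Call this before constructing the LLM:
# api_key = require_env("OPENAI_API_KEY")
```

This turns a confusing mid-run authentication error into an immediate, self-explanatory failure at startup.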

Key Takeaways

  • Define tools as Python functions or classes and wrap them with Tool from langchain.agents.
  • Pass the list of tools to initialize_agent along with an LLM to create a functional LangChain agent.
  • Use clear, descriptive tool descriptions to help the agent decide when to invoke each tool.
  • Async tools require coroutine functions passed via Tool's coroutine argument and execution through agent.arun.
  • Always set your API keys in environment variables and install the latest SDK versions.
Verified 2026-04 · gpt-4o, gpt-4o-mini, claude-3-5-sonnet-20241022