
How to use DuckDuckGo search with LangChain agent

Quick answer
Use LangChain's DuckDuckGoSearchResults tool with an agent (for example, a zero-shot ReAct agent created via initialize_agent) to enable web search capabilities. Instantiate the search tool, create an agent with an LLM such as ChatOpenAI, and run queries; the agent triggers DuckDuckGo searches as part of its reasoning.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install langchain langchain-openai langchain-community duckduckgo-search

Setup

Install required packages and set your OpenAI API key as an environment variable.

bash
pip install langchain langchain-openai langchain-community duckduckgo-search

# In your shell (replace with your actual key):
export OPENAI_API_KEY="your-api-key"

Step by step

This example shows how to create a LangChain agent that uses DuckDuckGo search to answer questions dynamically.

python
import os
from langchain_openai import ChatOpenAI
from langchain.agents import initialize_agent, Tool
from langchain_community.tools import DuckDuckGoSearchResults

# Initialize the LLM
llm = ChatOpenAI(model="gpt-4o", temperature=0)

# Create the DuckDuckGo search tool
search_tool = DuckDuckGoSearchResults()

# Wrap the search tool as a LangChain Tool
tools = [
    Tool(
        name="DuckDuckGo Search",
        func=search_tool.run,
        description="Useful for answering questions by searching the web using DuckDuckGo."
    )
]

# Initialize the agent with the LLM and tools
agent = initialize_agent(
    tools,
    llm,
    agent="zero-shot-react-description",
    verbose=True
)

# Run a query that triggers DuckDuckGo search
query = "Who won the Best Picture Oscar in 2025?"
result = agent.run(query)
print(result)
output
Best Picture Oscar 2025 was awarded to [Answer from DuckDuckGo search results].
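Note that agent.run is deprecated in recent LangChain releases in favor of agent.invoke, which returns a dict with an "output" key rather than a plain string. If your code needs to work with either call style, a small helper (a sketch; the function name is my own) can normalize the result:

```python
def extract_output(result):
    """Return the final answer string from either agent API style.

    agent.run(...) returns a plain string, while agent.invoke(...)
    returns a dict whose "output" key holds the final answer.
    """
    if isinstance(result, dict):
        return result.get("output", "")
    return result

# Works with either call style:
# extract_output(agent.run(query))
# extract_output(agent.invoke({"input": query}))
```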

Common variations

  • Use ChatAnthropic or ChatGoogleGenerativeAI (Gemini) models instead of OpenAI's ChatOpenAI.
  • Enable streaming output by setting streaming=True in the LLM constructor.
  • Use async versions of LangChain agents for concurrency.
  • Customize the search tool with parameters such as max_results, region, or time on DuckDuckGoSearchAPIWrapper.

Troubleshooting

  • If you get empty or no results, check your internet connection and whether DuckDuckGo is rate-limiting you; the duckduckgo-search package scrapes the site and can fail under heavy use.
  • Ensure your OpenAI API key is set in the OPENAI_API_KEY environment variable before starting Python.
  • Verbose mode helps debug agent decision-making steps.
  • Update packages regularly to avoid compatibility issues.
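Transient rate-limit failures from DuckDuckGo are common in practice. A simple retry-with-backoff wrapper (a sketch; in real code, catch the specific exception your duckduckgo-search version raises rather than bare Exception) can smooth them over:

```python
import time

def run_with_retry(fn, *args, attempts=3, delay=1.0, **kwargs):
    """Call fn, retrying on failure with exponential backoff.

    Narrow the except clause to the rate-limit exception raised by
    your installed duckduckgo-search version for production use.
    """
    for attempt in range(attempts):
        try:
            return fn(*args, **kwargs)
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(delay * (2 ** attempt))

# Usage (hypothetical): run_with_retry(search_tool.run, "LangChain")
```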

Key Takeaways

  • Use LangChain's built-in DuckDuckGoSearchResults tool to integrate web search.
  • Combine the search tool with a zero-shot ReAct agent for dynamic query answering.
  • Set environment variables for API keys and install dependencies before running the agent.
  • Verbose mode helps understand how the agent uses search to answer questions.
  • You can swap models or enable streaming for different use cases.
Verified 2026-04 · gpt-4o, ChatOpenAI