Intermediate · 3 min read

How to connect MCP tools to a LangChain agent

Quick answer
Use the official mcp Python SDK to create an MCP server that exposes your tools, then load those tools into a LangChain agent with the langchain-mcp-adapters package. This setup lets LangChain agents invoke MCP tools through the standardized Model Context Protocol.

PREREQUISITES

  • Python 3.10+ (required by the mcp SDK)
  • pip install mcp langchain-mcp-adapters langgraph langchain-openai
  • OpenAI API key set in the environment variable OPENAI_API_KEY

Setup

Install the required packages and set your environment variables before integrating MCP tools with LangChain.

bash
pip install mcp langchain-mcp-adapters langgraph langchain-openai
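The API key from the prerequisites can be exported in the shell before running the examples; the key value below is a placeholder:

```shell
# Export the OpenAI API key for the current shell session (placeholder value)
export OPENAI_API_KEY="sk-your-key-here"

# Confirm it is visible to child processes such as the Python client
echo "${OPENAI_API_KEY:+key is set}"
```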

Step by step

This example creates an MCP server that exposes a simple echo tool using the official mcp SDK's FastMCP helper, then connects a LangChain agent to it through the langchain-mcp-adapters package, using OpenAI's gpt-4o model. The server lives in its own file (server.py) because the stdio transport launches it as a subprocess.

python
# server.py - MCP server exposing a single "echo" tool over stdio
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("echo-server")

@mcp.tool()
def echo(text: str) -> str:
    """Echo the input text back to the caller."""
    return f"Echo: {text}"

if __name__ == "__main__":
    mcp.run(transport="stdio")

python
# client.py - LangChain agent that loads and calls the MCP tool
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

async def main():
    # The client spawns server.py as a subprocess and talks to it over stdio
    client = MultiServerMCPClient(
        {
            "echo": {
                "command": "python",
                "args": ["server.py"],
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()

    # ChatOpenAI reads OPENAI_API_KEY from the environment automatically
    model = ChatOpenAI(model="gpt-4o", temperature=0)
    agent = create_react_agent(model, tools)

    response = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Use the echo tool on 'Hello MCP tool!'"}]}
    )
    print("Agent response:", response["messages"][-1].content)

asyncio.run(main())

output
Agent response: Echo: Hello MCP tool!

The exact wording of the final answer depends on the model; what matters is that the tool's return value appears in it.

Common variations

  • Use the stdio transport for local servers, or the SSE / streamable HTTP transports for remote MCP servers.
  • The adapters client is async by design; call it from async code, or wrap the entry point in asyncio.run in synchronous scripts.
  • Use different OpenAI models like gpt-4.1, or Anthropic models with LangChain's ChatAnthropic.
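As a sketch of the transport variation, the connection config passed to langchain-mcp-adapters' MultiServerMCPClient differs only in its transport-specific keys; the server names ("local_echo", "remote_echo") and the URL below are placeholders:

```python
# Hypothetical connection configs for MultiServerMCPClient;
# server names and the URL are placeholders, not real endpoints.
stdio_config = {
    "local_echo": {
        "command": "python",      # launch the server as a subprocess
        "args": ["server.py"],
        "transport": "stdio",
    }
}

sse_config = {
    "remote_echo": {
        "url": "http://localhost:8000/sse",  # SSE endpoint of a remote server
        "transport": "sse",
    }
}

# Only the transport-specific keys differ between the two configs
print(sorted(stdio_config["local_echo"]))  # ['args', 'command', 'transport']
print(sorted(sse_config["remote_echo"]))   # ['transport', 'url']
```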

Troubleshooting

  • If the agent cannot connect to the MCP server, verify the server is running and accessible via the chosen transport.
  • Ensure environment variables like OPENAI_API_KEY are set correctly.
  • Check that the tool names registered in the MCP server match the names the agent sees after loading the tools.
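A small fail-fast check covers the environment-variable bullet; require_env below is a hypothetical helper, not part of any of the libraries above:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or fail with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before starting the agent")
    return value

# Check the key the client needs before doing any network work, e.g.:
# require_env("OPENAI_API_KEY")
```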

Key Takeaways

  • Use the official mcp Python SDK to expose tools as MCP servers.
  • Connect MCP servers to LangChain agents via the langchain-mcp-adapters package for seamless tool integration.
  • Run MCP servers with stdio or SSE transport for compatibility with LangChain.
  • Set environment variables securely and use current OpenAI models like gpt-4o.
  • Test connectivity and tool registration to avoid common integration issues.
Verified 2026-04 · gpt-4o