How to use MCP with LangChain
Quick answer
Use the `mcp` Python SDK to create an MCP server that connects AI agents to tools, then integrate it with LangChain by implementing a custom chain or agent that communicates via MCP. This enables LangChain to leverage MCP's protocol for tool usage and resource access.

Prerequisites
- Python 3.8+
- `pip install mcp langchain`
- Anthropic API key (for the MCP server's model calls)
- OpenAI API key (if using OpenAI models in LangChain)
Setup
Install the required packages `mcp` and `langchain` via pip, and set your environment variables for API keys.

```
pip install mcp langchain
```

Step by step
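The API keys can be exported in the shell before launching anything (the values below are placeholders, not real keys):

```shell
# Placeholder values; substitute your real keys
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"   # only if using OpenAI models
```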
This example shows how to create an MCP server using the `mcp` SDK and integrate it with LangChain by defining a custom Chain that communicates with the MCP server over stdio.
```python
import asyncio
import os
import threading

from anthropic import Anthropic
from langchain.chains.base import Chain
from mcp.server import Server
from mcp.server.stdio import stdio_server

# Step 1: Define the MCP server and its message handler
class MyMCPServer(Server):
    def __init__(self):
        super().__init__("langchain-mcp-demo")  # the server needs a name
        self.client = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

    async def handle_message(self, message):
        # Example: forward the message content to an Anthropic Claude model
        response = self.client.messages.create(
            model="claude-3-5-sonnet-20241022",
            max_tokens=512,
            system="You are an assistant connected via MCP.",
            messages=[{"role": "user", "content": message.content}],
        )
        return response.content[0].text

# Step 2: Run the MCP server in a background thread (for demo purposes;
# stdio_server is an async context manager, so it needs an event loop)
async def serve():
    server = MyMCPServer()
    async with stdio_server() as (read_stream, write_stream):
        await server.run(read_stream, write_stream,
                         server.create_initialization_options())

thread = threading.Thread(target=lambda: asyncio.run(serve()), daemon=True)
thread.start()

# Step 3: Define a LangChain Chain that sends queries to the MCP server
class MCPChain(Chain):
    @property
    def input_keys(self):
        return ["input"]

    @property
    def output_keys(self):
        return ["output"]

    def _call(self, inputs):
        # For the demo, simulate sending input to the MCP server and
        # getting a response. In production, implement IPC or network
        # communication with the MCP server here.
        user_input = inputs["input"]
        output = f"MCP server response to: {user_input}"
        return {"output": output}

# Step 4: Use MCPChain like any other LangChain chain
mcp_chain = MCPChain()
result = mcp_chain.invoke({"input": "Hello from LangChain to MCP!"})
print(result["output"])
```

output

```
MCP server response to: Hello from LangChain to MCP!
```
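To replace the demo echo in `_call` with real communication, note that MCP messages are JSON-RPC 2.0. A minimal, stdlib-only sketch of how a production chain might frame a `tools/call` request and decode the reply (the `ask` tool name is a made-up example; the actual transport write/read is omitted):

```python
import itertools
import json

_ids = itertools.count(1)

def make_mcp_request(method: str, params: dict) -> str:
    """Encode an MCP call as a JSON-RPC 2.0 request string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": method,
        "params": params,
    })

def parse_mcp_response(raw: str) -> dict:
    """Decode a JSON-RPC 2.0 response, raising on protocol errors."""
    msg = json.loads(raw)
    if "error" in msg:
        raise RuntimeError(msg["error"].get("message", "MCP error"))
    return msg["result"]

# Frame a request for a hypothetical "ask" tool
req = make_mcp_request("tools/call",
                       {"name": "ask", "arguments": {"prompt": "hi"}})
```

The framed string would then be written to whichever transport (stdio pipe, TCP socket, HTTP body) connects the chain to the MCP server.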
Common variations
- Use an async MCP server with `asyncio` for better concurrency.
- Integrate MCP with LangChain agents by wrapping MCP calls inside `Tool` objects.
- Use different models, such as `gpt-4o` or `claude-sonnet-4-5`, in the MCP server handler.
- Communicate with the MCP server over TCP or HTTP instead of stdio for distributed setups.
Troubleshooting
- If the MCP server does not start, check that `ANTHROPIC_API_KEY` is set correctly.
- For IPC communication errors, verify that the transport method (stdio, TCP) matches between the MCP server and the LangChain client.
- If LangChain does not receive responses, ensure the MCP server's `handle_message` method returns valid strings.
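The last issue can be caught early with a small guard; `ensure_text` is a hypothetical helper, not part of the SDK:

```python
def ensure_text(value) -> str:
    # Reject None and coerce other non-string handler results so the
    # transport always receives a serializable string.
    if value is None:
        raise ValueError("MCP handler returned None")
    return value if isinstance(value, str) else str(value)
```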
Key Takeaways
- Use the official `mcp` Python SDK to create MCP servers that connect AI agents to tools.
- Integrate MCP with LangChain by implementing custom chains or agents that communicate with MCP servers.
- Run MCP servers over stdio or network transports for flexible deployment.
- Use Anthropic models such as `claude-3-5-sonnet-20241022` inside MCP handlers for powerful AI responses.
- Ensure environment variables for API keys are set to avoid authentication errors.