How to migrate LangChain ConversationChain to LCEL
Quick answer
To migrate from ConversationChain to LCEL, replace the old chain construction with LCEL composition: build a ChatPromptTemplate, pipe it into a chat model with the | operator, and call invoke() on the result. Use langchain_core and langchain_openai imports, and adapt your message handling to the LCEL API patterns.
Prerequisites
- Python 3.8+
- OpenAI API key (free tier works)
- pip install langchain_openai langchain_core
Setup
Install the required LangChain packages and set your OpenAI API key in the environment.
pip install langchain_openai langchain_core
Step by step
This example shows how to migrate a simple ConversationChain to LCEL using ChatOpenAI and ChatPromptTemplate.
import os
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
# Old ConversationChain style (for reference):
# from langchain.chains import ConversationChain
# llm = ChatOpenAI(model_name="gpt-4o", temperature=0)
# chain = ConversationChain(llm=llm)
# response = chain.run("Hello, how are you?")
# New LCEL style migration:
# Initialize the chat model
chat = ChatOpenAI(model="gpt-4o", temperature=0, api_key=os.environ["OPENAI_API_KEY"])
# Define a prompt template
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{input}"),
])
# Compose the chain with the LCEL pipe operator
chain = prompt | chat
# Run the chain; the result is an AIMessage
response = chain.invoke({"input": "Hello, how are you?"})
print(response.content)
Output
I'm doing well, thank you! How can I assist you today?
Common variations
You can use async calls with await chain.ainvoke() or switch to other models like gpt-4o-mini. For Anthropic models, use ChatAnthropic with similar prompt templates.
import asyncio

async def async_example():
    response = await chain.ainvoke({"input": "Hello asynchronously!"})
    print(response.content)

asyncio.run(async_example())
Output
Hello asynchronously! How can I help you today?
Troubleshooting
- If you see ImportError, verify you installed langchain_openai and langchain_core.
- If the API key is missing, set OPENAI_API_KEY in your environment.
- For prompt errors, ensure your prompt template matches the new LCEL format.
Key Takeaways
- Use ChatOpenAI and ChatPromptTemplate from langchain_openai and langchain_core to migrate to LCEL.
- Replace ConversationChain with an LCEL pipeline (prompt | model) and adapt your prompt structure accordingly.
- Use chain.invoke() for synchronous calls and chain.ainvoke() for async.
- Always load your API key from os.environ to avoid hardcoding.
- Check your prompt roles and formatting to match LCEL expectations.