Debug Fix intermediate · 3 min read

Topic Group: Common Migration Errors

Quick answer
Common LangChain migration errors arise from outdated imports, deprecated client usage, and incorrect model names. Use the latest imports from langchain_openai and langchain_community, and ensure you call the OpenAI or Anthropic clients with the current SDK patterns and model names like gpt-4o or claude-3-5-sonnet-20241022.
ERROR TYPE code_error
⚡ QUICK FIX
Update your imports to langchain_openai and langchain_community, and switch to the v1+ client instantiation and method calls for the OpenAI and Anthropic SDKs.

Why this happens

LangChain has updated its SDK and import paths, deprecating older modules like langchain.llms and langchain.chat_models. Using these deprecated imports or old client patterns causes import errors or silent failures. Additionally, calling OpenAI or Anthropic APIs with outdated methods like openai.ChatCompletion.create() or missing the system= parameter for Claude leads to runtime errors or unexpected behavior.

Example broken code:

python
import os
import openai
from langchain.llms import OpenAI  # deprecated import path; fails on current LangChain

llm = OpenAI(openai_api_key=os.environ["OPENAI_API_KEY"])
response = openai.ChatCompletion.create(  # removed in openai SDK v1+
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)
output
ModuleNotFoundError: No module named 'langchain.llms'
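Both failure modes can be caught before runtime with a simple source scan for deprecated patterns. A minimal sketch, where the regexes and function name are illustrative and not exhaustive:

```python
import re

# Illustrative patterns for imports/calls deprecated by newer LangChain and openai SDK v1+
DEPRECATED_PATTERNS = {
    r"from\s+langchain\.llms\s+import": "use langchain_openai / langchain_community instead",
    r"from\s+langchain\.chat_models\s+import": "use langchain_openai.ChatOpenAI instead",
    r"openai\.ChatCompletion\.create": "use client.chat.completions.create (openai v1+)",
}

def find_deprecated_usage(source: str) -> list[tuple[int, str]]:
    """Return (line_number, hint) pairs for lines matching a deprecated pattern."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, hint in DEPRECATED_PATTERNS.items():
            if re.search(pattern, line):
                hits.append((lineno, hint))
    return hits

snippet = "from langchain.llms import OpenAI\nresp = openai.ChatCompletion.create(model='x')\n"
for lineno, hint in find_deprecated_usage(snippet):
    print(f"line {lineno}: {hint}")
```

Running a check like this in CI flags each offending line with a migration hint before the code ever reaches an interpreter.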

The fix

Two migration paths work: update to the new LangChain packages (langchain_openai and langchain_community), or call the provider SDKs directly, as the examples below do. Instantiate the client from the provider SDK (OpenAI or anthropic.Anthropic) with the API key from os.environ, and use current model names like gpt-4o or claude-3-5-sonnet-20241022. For Claude, pass the system prompt via the top-level system= parameter rather than as a system message in the messages array.

This works because it aligns with the latest SDK v1+ patterns and avoids deprecated modules and methods.

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)

python
import os
import anthropic

client_anthropic = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
message = client_anthropic.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system="You are a helpful assistant.",  # top-level parameter, not a message in the array
    messages=[{"role": "user", "content": "Hello"}]
)
print(message.content[0].text)
output
(a model-generated greeting from each client; exact text varies per run)

Preventing it in production

Implement automated tests to verify your LangChain imports and client calls after upgrades. Use dependency management tools to lock versions and monitor changelogs for breaking changes. Add retry logic for transient API errors and validate model names and parameters before runtime. Consider using CI pipelines to catch deprecated usage early.
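The retry advice can be as simple as a small wrapper with exponential backoff. A generic sketch, not LangChain-specific; the function names are illustrative:

```python
import time

def with_retries(call, max_attempts=3, base_delay=0.5,
                 retryable=(ConnectionError, TimeoutError)):
    """Call `call()`, retrying transient errors with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except retryable:
            if attempt == max_attempts:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * 2 ** (attempt - 1))

# Usage sketch: wrap any client call, e.g.
# response = with_retries(lambda: client.chat.completions.create(...))

# Demonstration with a flaky stand-in that fails twice, then succeeds:
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky, base_delay=0.01))  # → ok
```

In production you would typically restrict `retryable` to the SDK's own transient error types (rate limits, connection errors) so that permanent failures like invalid model names fail fast instead of being retried.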

Key Takeaways

  • Always update LangChain imports to langchain_openai and langchain_community to avoid import errors.
  • Use the latest OpenAI and Anthropic SDK client patterns with correct model names like gpt-4o and claude-3-5-sonnet-20241022.
  • Validate and test your AI integration code after upgrades to catch deprecated usage early.
Verified 2026-04 · gpt-4o, claude-3-5-sonnet-20241022