How-to · Beginner · 3 min read

How to use prompt templates in LangChain

Quick answer
Use ChatPromptTemplate from langchain_core.prompts to define reusable prompt templates with variables. Fill a template by calling format_messages with your input values, then pass the resulting messages to a LangChain chat model such as ChatOpenAI via invoke.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install "langchain-openai>=0.2.0" "langchain-core>=0.2.0"

Setup

Install the required LangChain packages and set your OpenAI API key as an environment variable.

bash
pip install langchain-openai langchain-core
export OPENAI_API_KEY="your-api-key"

Step by step

Create a ChatPromptTemplate with input variables, format it with values, and send the prompt to a ChatOpenAI model to get a response.

python
import os
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Initialize the chat model
chat = ChatOpenAI(model="gpt-4o", temperature=0, api_key=os.environ["OPENAI_API_KEY"])

# Define a prompt template with variables
prompt_template = ChatPromptTemplate.from_template(
    "Translate the following English text to French: {text}"
)

# Format the prompt into a list of chat messages
messages = prompt_template.format_messages(text="Hello, how are you?")

# Send the messages to the model
response = chat.invoke(messages)

print(response.content)
output
Bonjour, comment ça va ?

Common variations

  • Use multiple input variables in the template, e.g., "Summarize the following text in {language}: {text}".
  • Use other chat models, e.g., claude-3-5-sonnet-20241022 via ChatAnthropic from the langchain_anthropic package.
  • Use async calls with LangChain's async client methods.
python
import asyncio
import os

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

async def async_example():
    chat = ChatOpenAI(model="gpt-4o", temperature=0, api_key=os.environ["OPENAI_API_KEY"])
    prompt = ChatPromptTemplate.from_template(
        "Summarize the following text in {language}: {text}"
    )
    messages = prompt.format_messages(text="LangChain simplifies prompt management.", language="Spanish")
    response = await chat.ainvoke(messages)
    print(response.content)

asyncio.run(async_example())
output
LangChain simplifica la gestión de prompts.

Troubleshooting

  • If you get a KeyError on formatting, ensure all variables in the template are provided in format().
  • If the API key is missing or invalid, set OPENAI_API_KEY correctly in your environment.
  • For model errors, verify the model name is current and supported.

Key Takeaways

  • Use ChatPromptTemplate to create reusable, variable-driven prompts.
  • Always format the prompt with all required variables before sending to the model.
  • LangChain supports both synchronous and asynchronous usage patterns.
  • Keep your API keys secure and environment variables properly configured.
  • Verify model names regularly as they may update over time.
Verified 2026-04 · gpt-4o, claude-3-5-sonnet-20241022