How-to · Beginner · 3 min read

How to use JsonOutputParser in LangChain

Quick answer
Use JsonOutputParser in LangChain to parse an AI model's response into a Python dict: describe the expected JSON schema in your prompt, call the model, and pass the raw text to the parser. This enables reliable extraction of structured data from natural language outputs.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install langchain-openai

Setup

Install the required packages and set your OpenAI API key as an environment variable.

bash
pip install langchain-openai

Step by step

This example shows how to use JsonOutputParser with LangChain to parse a model's response into a structured JSON object. We define a JSON schema, embed it in a prompt template, call the OpenAI gpt-4o-mini model, and parse the output with JsonOutputParser.

python
import json
import os
from langchain_openai import ChatOpenAI
from langchain_core.output_parsers import JsonOutputParser
from langchain_core.prompts import ChatPromptTemplate, HumanMessagePromptTemplate

# Define the JSON schema for the expected output
json_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
        "email": {"type": "string", "format": "email"}
    },
    "required": ["name", "age", "email"]
}

# Initialize the JsonOutputParser. It accepts an optional pydantic_object,
# but here the schema is enforced through the prompt instructions instead.
parser = JsonOutputParser()

# Create a prompt template that instructs the model to output JSON
prompt_template = ChatPromptTemplate.from_messages([
    HumanMessagePromptTemplate.from_template(
        "Provide user info in JSON format matching this schema:\n{schema}\nUser: {user_input}"
    )
])

# Format the prompt, serializing the schema dict to a JSON string
prompt = prompt_template.format_prompt(
    schema=json.dumps(json_schema, indent=2),
    user_input="Tell me about Alice."
)

# Initialize the chat model client
client = ChatOpenAI(model="gpt-4o-mini", temperature=0, api_key=os.environ["OPENAI_API_KEY"])

# Call the model (calling the client directly is deprecated; use invoke)
response = client.invoke(prompt.to_messages())

# Parse the model's JSON output
parsed_output = parser.parse(response.content)

print("Raw model output:", response.content)
print("Parsed JSON output:", parsed_output)
output
Raw model output: {"name": "Alice", "age": 30, "email": "alice@example.com"}
Parsed JSON output: {'name': 'Alice', 'age': 30, 'email': 'alice@example.com'}
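
Models often wrap their JSON in a markdown code fence (```json … ```), and JsonOutputParser strips that wrapper internally before parsing. A minimal standard-library sketch of the same idea (the regex and function name are illustrative, not LangChain's actual internals):

```python
import json
import re

def parse_json_loosely(text):
    """Strip an optional markdown code fence, then parse the JSON inside."""
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    if match:
        text = match.group(1)
    return json.loads(text)

fenced = '```json\n{"name": "Alice", "age": 30}\n```'
print(parse_json_loosely(fenced))  # {'name': 'Alice', 'age': 30}
```

This is why printing `response.content` sometimes shows fences while the parsed dict is clean.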

Common variations

  • Use JsonOutputParser with other LLMs supported by LangChain by changing the model parameter.
  • Combine JsonOutputParser with LangChain chains for complex workflows.
  • Use async calls with LangChain's async clients if needed.

Troubleshooting

  • If parsing fails due to invalid JSON, ensure the model is instructed clearly to output strict JSON matching the schema.
  • Use temperature=0 to reduce randomness and improve JSON output consistency.
  • Validate your JSON schema for correctness to avoid parser errors.
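
Note that JsonOutputParser checks only that the output is valid JSON, not that it matches your schema. A minimal standard-library sketch of a post-parse check for required keys and types (for production, prefer the jsonschema package):

```python
# Map JSON-schema type names to Python types
TYPE_MAP = {"string": str, "integer": int, "number": float, "boolean": bool, "object": dict}

def check_against_schema(data, schema):
    """Return a list of problems; an empty list means the data passes."""
    problems = []
    for key in schema.get("required", []):
        if key not in data:
            problems.append(f"missing required key: {key}")
    for key, spec in schema.get("properties", {}).items():
        expected = TYPE_MAP.get(spec.get("type"))
        if key in data and expected and not isinstance(data[key], expected):
            problems.append(f"{key}: expected {spec['type']}, got {type(data[key]).__name__}")
    return problems

schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}
print(check_against_schema({"name": "Alice", "age": 30}, schema))  # []
print(check_against_schema({"name": "Alice"}, schema))  # ['missing required key: age']
```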

Key Takeaways

  • Use JsonOutputParser to convert AI text outputs into structured JSON automatically.
  • Define a clear JSON schema to guide the model's output format.
  • Set temperature=0 in the model call to improve JSON output reliability.
Verified 2026-04 · gpt-4o-mini