
How to add a human in the loop with LangGraph

Quick answer
To add a human in the loop with LangGraph, design your graph to pause at key decision points and wait for human input: in a simple console script, a node can block on input(); in production, LangGraph's interrupt mechanism (paired with a checkpointer) pauses the run so a human can resume it later. Either way, AI automation and human judgment live in the same orchestrated workflow.

PREREQUISITES

  • Python 3.9+
  • OpenAI API key (free tier works)
  • pip install langgraph openai

Setup

Install LangGraph and set your OpenAI API key as an environment variable to enable LLM calls.

bash
pip install langgraph openai

Step by step

This example shows a simple LangGraph agent that queries an LLM, then pauses to ask a human for confirmation before continuing.

python
import os
from typing import TypedDict

from langgraph.graph import StateGraph, START, END

# The OpenAI client reads OPENAI_API_KEY from the environment,
# e.g. run `export OPENAI_API_KEY="sk-..."` before starting this script.

# Shared state passed between nodes
class State(TypedDict):
    prompt: str
    llm_response: str
    human_confirmed_response: str

# Node that calls the LLM
def llm_node(state: State) -> dict:
    from openai import OpenAI
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": state["prompt"]}],
    )
    return {"llm_response": response.choices[0].message.content}

# Node that pauses for human input on the console
def human_in_loop_node(state: State) -> dict:
    print("LLM response:", state["llm_response"])
    human_input = input("Please confirm or modify the response (type your input): ")
    return {"human_confirmed_response": human_input}

# Build the LangGraph workflow
builder = StateGraph(State)
builder.add_node("llm", llm_node)
builder.add_node("human", human_in_loop_node)
builder.add_edge(START, "llm")
builder.add_edge("llm", "human")
builder.add_edge("human", END)
graph = builder.compile()

# Run the graph
if __name__ == "__main__":
    outputs = graph.invoke({"prompt": "Write a short poem about AI."})
    print("Final output after human in the loop:", outputs.get("human_confirmed_response"))
output
LLM response: [AI-generated poem]
Please confirm or modify the response (type your input): [user input]
Final output after human in the loop: [user input]

Common variations

  • Use async nodes for non-blocking human input in web apps.
  • Replace input() with UI callbacks in GUI or web frameworks.
  • Swap gpt-4o for another model, such as claude-3-5-sonnet-20241022 (this requires the Anthropic SDK rather than the OpenAI client).

Troubleshooting

  • If human input blocks automation, consider async or event-driven input handling.
  • Ensure OPENAI_API_KEY is set correctly to avoid authentication errors.
  • Check LangGraph version compatibility if nodes fail to connect.
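
For the second point, a fail-fast startup check (a hypothetical helper, not part of LangGraph or the OpenAI SDK) turns a missing key into a clear error instead of an opaque 401 later:

```python
import os

def require_openai_key() -> str:
    """Return the OpenAI API key, or fail fast with a clear message."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; run `export OPENAI_API_KEY=...` first"
        )
    return key
```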

Key Takeaways

  • Use LangGraph nodes to insert human decision points in AI workflows.
  • Human input can be collected via console, UI, or async callbacks depending on your app.
  • Always secure API keys via environment variables for safe LLM access.
Verified 2026-04 · gpt-4o, claude-3-5-sonnet-20241022