Human-in-the-loop with LangGraph
Quick answer
Use LangGraph to build stateful AI workflows that incorporate human feedback by defining nodes that pause for user input and resume processing. Implement human-in-the-loop by adding nodes that collect user responses and conditionally proceed based on that input within the StateGraph framework.
Prerequisites
- Python 3.9+ (the async example below uses asyncio.to_thread)
- pip install langgraph
- pip install openai>=1.0
- OpenAI API key (free tier works), set as the environment variable OPENAI_API_KEY
Setup
Install langgraph and openai Python packages, and set your OpenAI API key as an environment variable.
- Run pip install langgraph openai
- Export your API key: export OPENAI_API_KEY='your_api_key_here' (Linux/macOS), or set it in the Windows environment variables.
pip install langgraph openai
output:
Collecting langgraph
Collecting openai
Successfully installed langgraph-0.x.x openai-1.x.x
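Before running the examples, it can help to confirm the key is actually visible to Python. A minimal stdlib-only check (this snippet is an optional sketch, not required by LangGraph itself):

```python
import os

# Report whether the API key is visible to the current Python process
key = os.environ.get("OPENAI_API_KEY")
print("OPENAI_API_KEY set:", bool(key))
```

If this prints False, re-export the variable in the same shell session you run Python from.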
Step by step
This example demonstrates a simple human-in-the-loop workflow using LangGraph. The graph pauses to ask the user a question, waits for input, then continues processing based on the response.
from typing import TypedDict

from langgraph.graph import StateGraph, END

# Define the state schema
class State(TypedDict):
    messages: list
    user_input: str

# Node that sends a prompt and waits for human input
def ask_user(state: State) -> State:
    print("AI: What is your favorite programming language?")
    user_response = input("User: ")
    return {"messages": state["messages"] + ["Asked user"], "user_input": user_response}

# Node that processes the human input
def process_input(state: State) -> State:
    response = f"AI: You said your favorite language is {state['user_input']}!"
    print(response)
    return {"messages": state["messages"] + [response], "user_input": state["user_input"]}

# Build the graph
graph = StateGraph(State)
graph.add_node("ask_user", ask_user)
graph.add_node("process_input", process_input)
graph.set_entry_point("ask_user")
graph.add_edge("ask_user", "process_input")
graph.add_edge("process_input", END)

# Compile the graph into a runnable app, then invoke it
app = graph.compile()
initial_state = {"messages": [], "user_input": ""}
result = app.invoke(initial_state)
print("Workflow complete.")
output:
AI: What is your favorite programming language?
User: Python
AI: You said your favorite language is Python!
Workflow complete.
Common variations
You can extend human-in-the-loop workflows by integrating asynchronous input, streaming AI responses, or using different LLM providers.
- Async input: Use async functions and event loops to handle user input in web or GUI apps.
- Streaming AI: Stream partial AI responses while waiting for human feedback.
- Different models: Use OpenAI or Anthropic clients inside nodes for dynamic AI generation.
import asyncio
from typing import TypedDict

from langgraph.graph import StateGraph, END

class State(TypedDict):
    messages: list
    user_input: str

async def ask_user_async(state: State) -> State:
    print("AI: Please enter your feedback asynchronously:")
    # Run blocking input() in a worker thread so the event loop stays free
    user_response = await asyncio.to_thread(input, "User: ")
    return {"messages": state["messages"] + ["Asked user async"], "user_input": user_response}

async def main():
    graph = StateGraph(State)
    graph.add_node("ask_user_async", ask_user_async)
    graph.set_entry_point("ask_user_async")
    graph.add_edge("ask_user_async", END)

    # Compile first, then use ainvoke for async execution
    app = graph.compile()
    initial_state = {"messages": [], "user_input": ""}
    result = await app.ainvoke(initial_state)
    print("Async workflow complete.")

asyncio.run(main())
output:
AI: Please enter your feedback asynchronously:
User: Great job!
Async workflow complete.
Troubleshooting
- If input() blocks indefinitely, ensure your environment supports interactive input (avoid running in non-interactive shells).
- If you get type errors, verify that your State TypedDict matches the data passed between nodes.
- For API key errors, confirm OPENAI_API_KEY is set correctly in your environment.
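For the first item, one way to detect a non-interactive session before the graph blocks on input() is a stdlib check; a sketch, not part of the examples above:

```python
import sys

# input() will hang or raise EOFError when stdin is not a terminal
if sys.stdin.isatty():
    print("Interactive terminal detected -- input() is safe.")
else:
    print("Non-interactive session -- supply user_input programmatically instead.")
```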
Key Takeaways
- Use StateGraph nodes to pause and collect human input for human-in-the-loop workflows.
- Define a TypedDict state schema to track messages and user responses across nodes.
- Integrate async functions for non-blocking human input in web or GUI applications.
- Combine LangGraph with OpenAI or Anthropic clients inside nodes for dynamic AI-human interaction.
- Always test in an interactive environment to avoid blocking on input() calls.