How-to · Intermediate · 3 min read

LangGraph async streaming

Quick answer
Use StateGraph to define your graph and compile it into an app that supports both sync and async execution. Await app.ainvoke() to run the graph asynchronously and get the final state, or iterate over app.astream() with async for to stream intermediate output as the graph runs.

PREREQUISITES

  • Python 3.9+
  • pip install langgraph
  • An async-capable Python environment

Setup

Install the langgraph package via pip and ensure you have Python 3.9 or newer (the example below uses built-in generic annotations such as list[str], which need 3.9). Async streaming requires a running event loop, so use asyncio or an async framework.

bash
pip install langgraph
output
Collecting langgraph
  Downloading langgraph-0.1.0-py3-none-any.whl (10 kB)
Installing collected packages: langgraph
Successfully installed langgraph-0.1.0
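Async streaming is built on Python's native coroutine machinery, so it helps to see the pattern in plain asyncio before wiring up a graph. The sketch below (hypothetical names, no LangGraph required) streams chunks from an async generator:

```python
import asyncio

# Hypothetical producer: an async generator that yields chunks over time,
# the same shape of work an async graph node performs internally.
async def produce_chunks(n: int):
    for i in range(n):
        await asyncio.sleep(0.01)  # simulate async I/O
        yield f"chunk {i + 1}"

async def main() -> list[str]:
    received = []
    # Consume the stream incrementally with async for
    async for chunk in produce_chunks(3):
        received.append(chunk)
    return received

chunks = asyncio.run(main())
print(chunks)  # ['chunk 1', 'chunk 2', 'chunk 3']
```

The same async for construct is what you later use to consume a compiled graph's stream.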

Step by step

Define a StateGraph with async nodes, compile it, and await the compiled app with ainvoke to run it asynchronously.

python
import asyncio
from langgraph.graph import StateGraph, END
from typing import TypedDict

class State(TypedDict):
    messages: list[str]

async def async_node(state: State) -> State:
    # Do incremental async work, accumulating chunks in the state
    for i in range(3):
        await asyncio.sleep(0.5)  # simulate async work
        state["messages"].append(f"chunk {i+1}")
    return state

# Create graph and add async node
graph = StateGraph(State)
graph.add_node("async_node", async_node)
graph.set_entry_point("async_node")
graph.add_edge("async_node", END)

# Compile the graph into a runnable app (supports both sync and async invocation)
app = graph.compile()

async def main():
    # Run the graph asynchronously with an initial state
    result = await app.ainvoke({"messages": []})
    print("Final messages:", result["messages"])

asyncio.run(main())
output
Final messages: ['chunk 1', 'chunk 2', 'chunk 3']

Common variations

  • Use synchronous nodes with app.invoke() for blocking calls.
  • Integrate with async frameworks like FastAPI by awaiting app.ainvoke() inside an async route handler.
  • Stream intermediate results with async for chunk in app.astream(...), which yields output after each node completes.
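For finer-grained streaming than per-node updates (for example, token by token), one common pattern is to have the producing code push chunks onto an asyncio.Queue that a consumer drains concurrently. A minimal sketch of that pattern, with hypothetical names and no LangGraph dependency:

```python
import asyncio

async def producer_node(queue: asyncio.Queue) -> None:
    # Inside a node: push partial results as they become available
    for i in range(3):
        await asyncio.sleep(0.01)  # simulate async work
        await queue.put(f"chunk {i + 1}")
    await queue.put(None)  # sentinel: no more chunks

async def consume(queue: asyncio.Queue) -> list[str]:
    received = []
    while (chunk := await queue.get()) is not None:
        received.append(chunk)
    return received

async def main() -> list[str]:
    queue: asyncio.Queue = asyncio.Queue()
    # Run producer and consumer concurrently so chunks stream as produced
    _, received = await asyncio.gather(producer_node(queue), consume(queue))
    return received

streamed = asyncio.run(main())
print(streamed)  # ['chunk 1', 'chunk 2', 'chunk 3']
```

The sentinel value (None) signals end-of-stream so the consumer loop knows when to stop.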

Troubleshooting

  • If await app.ainvoke() raises RuntimeError: no running event loop, ensure you are inside an async context or wrap the call with asyncio.run().
  • If you need partial results rather than only the final state, use app.astream() instead of ainvoke(); ainvoke() returns only once the graph has finished.

Key Takeaways

  • LangGraph supports async execution by compiling graphs and invoking them with ainvoke() or streaming them with astream().
  • Define async node functions to perform streaming or incremental processing inside the graph.
  • Use asyncio.run() or an async framework to run async LangGraph apps.
  • astream() yields output as each node finishes; finer-grained (e.g. per-token) streaming requires custom async logic inside nodes.
  • Always set entry points and edges correctly to ensure graph execution completes.
Verified 2026-04