LangGraph 1.0 Quick Guide Part 2: Conditional Edges, Memory, and Human‑in‑the‑Loop

This article walks through three advanced LangGraph 1.0 features—using the Command object for conditional routing, checkpoint‑based memory for state persistence across invocations, and interrupt‑driven human‑in‑the‑loop control—providing concrete code examples, execution traces, and a comparison of design trade‑offs.


1. Conditional Edges (Command Object)

The add_conditional_edges function together with a routing function lets the graph decide the next node at runtime. Using the Command object, a node can return both an update (state mutation) and a goto (next node), eliminating the need to pre‑define every edge.

from typing import Literal
from langgraph.graph import END

def conditional_edge(state: State) -> Literal['b', 'c', END]:
    select = state["nList"][-1]
    if select == 'b':
        return 'b'
    elif select == 'c':
        return 'c'
    else:  # 'q' or any unexpected input ends the run
        return END

builder.add_conditional_edges("a", conditional_edge)
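The routing contract can be exercised without a compiled graph. The standalone sketch below substitutes the literal string "__end__" for langgraph.graph.END (which holds that same value), so it runs with no LangGraph install:

```python
END = "__end__"  # stand-in for langgraph.graph.END, which holds this same string

def conditional_edge(state: dict) -> str:
    """Return the name of the next node based on the latest nList entry."""
    select = state["nList"][-1]
    if select in ("b", "c"):
        return select
    return END  # 'q' or anything unexpected terminates the run

print(conditional_edge({"nList": ["b"]}))       # b
print(conditional_edge({"nList": ["b", "q"]}))  # __end__
```

Because the function returns node names, the graph needs no edge declared per branch; the checker just has to know the set of possible targets, which is what the Literal annotation communicates.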

A full example shows node a returning a Command that updates the list and jumps to b, c or END based on the input.

from typing import Literal
from langgraph.graph import END
from langgraph.types import Command

def node_a(state: State) -> Command[Literal['b', 'c', END]]:
    select = state["nList"][-1]
    if select == 'b':
        next_node = 'b'
    elif select == 'c':
        next_node = 'c'
    else:  # 'q' or any unexpected input ends the run
        next_node = END
    # update merges into the state; goto names the next node to execute
    return Command(update=State(nList=[select]), goto=next_node)

Comparison: the author prefers add_conditional_edges because it separates routing logic from node logic, improving readability and maintainability.

2. Memory via Checkpoints

LangGraph snapshots the full state at key steps (e.g., after each node) as a checkpoint. A checkpointer (e.g., InMemorySaver) persists each snapshot, enabling long‑term memory across multiple graph.invoke calls.

from langgraph.checkpoint.memory import InMemorySaver
memory = InMemorySaver()
config = {"configurable": {"thread_id": "1"}}
graph = builder.compile(checkpointer=memory)

During a loop, the same thread_id loads the previous state, so the list nList accumulates inputs like ['b', 'b', 'c', 'c']. The author notes that swapping InMemorySaver for PostgresSaver or a custom saver provides durable storage for production.
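This accumulation also depends on the State schema attaching a reducer to nList; without one, each update would overwrite the list. The article does not show the State definition, so the following is an assumed minimal schema, built from the standard library only (LangGraph reads the Annotated metadata at compile time):

```python
import operator
from typing import Annotated, TypedDict

class State(TypedDict):
    # operator.add concatenates each node's partial update onto the stored
    # value instead of replacing it, which is why repeated invocations on
    # the same thread_id accumulate entries like ['b', 'b', 'c', 'c'].
    nList: Annotated[list, operator.add]

# The reducer itself is ordinary list concatenation:
print(operator.add(["b", "b"], ["c"]))  # ['b', 'b', 'c']
```

A different reducer (or none) changes the merge behavior, so the choice here is part of the memory design, not an implementation detail.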

3. Human‑in‑the‑Loop via Interrupts

When a node encounters unexpected input, it can call interrupt(), which pauses the graph and stores the snapshot. The external loop detects the special __interrupt__ field, prompts the user, and resumes execution with Command(resume=...).

from typing import Literal
from langgraph.graph import END
from langgraph.types import Command, interrupt

def node_a(state: State) -> Command[Literal['b', 'c', END]]:
    print('Entering node A')
    select = state['nList'][-1]
    if select == 'b':
        next_node = 'b'
    elif select == 'c':
        next_node = 'c'
    elif select == 'q':
        next_node = END
    else:
        # Pause the graph and persist a snapshot; on resume, interrupt()
        # returns the value supplied via Command(resume=...)
        admin = interrupt(f"Unexpected input {select}")
        print('User re-input:', admin)
        if admin == 'continue':
            next_node = 'b'
            select = 'b'
        else:
            next_node = END
            select = 'q'
    return Command(update=State(nList=[select]), goto=next_node)

The surrounding loop handles the interrupt:

while True:
    user = input('b, c or q to quit:')
    input_state = State(nList=[user])
    result = graph.invoke(input_state, config)
    print(result)
    if '__interrupt__' in result:
        msg = result['__interrupt__'][-1].value
        print(msg)
        human = input(f"\n{msg}, re-enter: ")
        result = graph.invoke(Command(resume=human), config)
    if result['nList'][-1] == 'q':
        print('quit')
        break

Key points: reusing the same thread_id lets the graph resume from the saved checkpoint, and interrupt() returns the value passed via Command(resume=...), allowing the node to take a different branch after human input.
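The pause-and-resume contract can be modeled without LangGraph at all: a Python generator yields the interrupt payload and receives the human's answer through send(), mirroring how interrupt() returns the Command(resume=...) value when the graph is re-invoked. This is an illustrative sketch of the control flow, not LangGraph's implementation:

```python
def node_a_sketch(n_list):
    """Generator stand-in for a node that may interrupt on bad input."""
    select = n_list[-1]
    if select not in ("b", "c", "q"):
        # yield ≈ interrupt(...): execution pauses here until a value is sent
        answer = yield f"Unexpected input {select}"
        select = "b" if answer == "continue" else "q"
    return select  # the value the node would write back into state

gen = node_a_sketch(["x"])
msg = next(gen)             # graph pauses; msg is the interrupt payload
try:
    gen.send("continue")    # ≈ graph.invoke(Command(resume="continue"), config)
except StopIteration as stop:
    print(stop.value)       # b
```

The real mechanism differs in one important way: LangGraph persists the checkpoint, so the resume can happen in a different process or days later, whereas a generator only survives within one Python process.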

4. Summary

The three advanced capabilities covered are:

Conditional Edges : route dynamically with Command or add_conditional_edges.

Memory : checkpoint snapshots enable state persistence and multi‑turn conversations.

Human‑in‑the‑Loop : interrupt and Command(resume=…) let external users intervene at critical points.

These building blocks let developers create complex, controllable, and stateful AI agents with LangGraph 1.0.

Written by

Fun with Large Models

Master's graduate from Beijing Institute of Technology, published four top‑journal papers, previously worked as a developer at ByteDance and Alibaba. Currently researching large models at a major state‑owned enterprise. Committed to sharing concise, practical AI large‑model development experience, believing that AI large models will become as essential as PCs in the future. Let's start experimenting now!
