Building a Basic Chatbot with LangGraph: Step‑by‑Step AI Agent Tutorial
This article walks through building AI agents with LangGraph in Python, starting with a simple GCD workflow and then creating a memory‑enabled chatbot using GPT‑4o, covering state management, nodes, edges, conditional loops, recursion limits, and visual debugging.
01 Exploring LangGraph: From LangChain to Agents
LangChain is a framework that lets you use base models such as GPT or Claude and equip them with tools for reasoning, planning, and execution. LangGraph builds on this by giving you full control over workflows: you can create custom states, nodes, edges, conditions, and middleware. While LangChain is suited for simple linear chains that require manual memory handling, LangGraph adds out‑of‑the‑box state management, looping workflows, and human supervision.
LangGraph’s graph architecture represents an agent workflow as a directed graph, supporting loops and conditional branches while preserving shared state between components. LangSmith, a companion platform for tracing and continuous testing, belongs to the same ecosystem but is not used in this tutorial.
02 First Building Block: Your First LangGraph Workflow
We start with a classic example—computing the greatest common divisor (GCD) of two numbers—to illustrate core LangGraph concepts.
a = int(input("a = "))
b = int(input("b = "))
while b != 0:
    a, b = b, a % b
print("GCD =", a)

In LangGraph, State acts as global memory accessible to all nodes. For the GCD workflow we define:
from typing import TypedDict

class State(TypedDict):
    a: int
    b: int

Nodes are functions that receive a copy of the current state and return a partial state update. For example:
def get_user_data(_: State) -> State:
    """Get user input numbers"""
    a = int(input("a = "))
    b = int(input("b = "))
    return {"a": a, "b": b}

def modify(state: State) -> State:
    """Perform one step of Euclid's algorithm"""
    a, b = state["a"], state["b"]
    a, b = b, a % b
    return {"a": a, "b": b}

def write(state: State) -> State:
    """Output the final result"""
    print("GCD =", state["a"])
    return {}

Edges define execution order. We add a START edge to the input node, conditional edges that loop back to modify while b != 0, and an END edge after writing the result. Because LangGraph caps recursion depth (25 steps by default), we raise the limit when invoking the workflow: workflow.invoke({}, {"recursion_limit": 100}). Running the program with a = 64 and b = 240 yields GCD = 16.
03 Smart Upgrade: Building a Memory‑Enabled Chatbot
Next we create a multi‑turn chatbot using LangGraph and OpenAI’s gpt‑4o. The environment is set up with uv for isolated virtual environments and the required libraries.
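One possible setup with uv might look like this; the package names are assumptions inferred from the imports used below, so adjust them to your project:

```shell
# Create and activate an isolated virtual environment with uv
uv venv
source .venv/bin/activate

# Install LangGraph, LangChain, the OpenAI integration, and dotenv support
uv pip install langgraph langchain langchain-openai python-dotenv

# The OpenAI key is read from a .env file via load_dotenv()
echo 'OPENAI_API_KEY=<your-key>' > .env
```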
from langgraph.graph import StateGraph, START, END
from langchain.chat_models import init_chat_model
from langchain_core.messages import HumanMessage, AIMessage
from langchain.agents import AgentState
from dotenv import load_dotenv
import os
load_dotenv()

We define a custom state that inherits from AgentState so that the messages field automatically appends new messages:
class State(AgentState):
    iteration: int  # track conversation rounds

The core conversation node interacts with the LLM and updates the state:
ITERATION_LIMIT = 5
model = init_chat_model("openai:gpt-4o")

def ask_llm(state: State) -> State:
    """Core node that talks to the LLM"""
    user_query = input("query: ")
    user_message = HumanMessage(user_query)
    answer_message: AIMessage = model.invoke(state["messages"] + [user_message])
    print("answer:", answer_message.content)
    return {"messages": [user_message, answer_message], "iteration": state["iteration"] + 1}

We build the graph, add a START edge to ask_llm, and a conditional edge that repeats the node while iteration < ITERATION_LIMIT. When the limit is reached the workflow ends.
graph = StateGraph(State)
graph.add_node("ask_llm", ask_llm)
graph.add_edge(START, "ask_llm")
graph.add_conditional_edges(
    "ask_llm",
    lambda state: state["iteration"] < ITERATION_LIMIT,
    {True: "ask_llm", False: END},
)
workflow = graph.compile()
initial_state = {"iteration": 0, "messages": []}
workflow.invoke(initial_state, {"recursion_limit": 100})

Sample interaction:
query: Hello, I'm Zhang San
answer: Hello, Zhang San! Nice to meet you. Is there anything I can help you with?
query: Do you still remember my name?
answer: Of course. You just told me your name is Zhang San. What can I help you with today?
query: What is 2+2?
answer: 2+2 equals 4. Any other math questions?
...

The chatbot now maintains full conversation history via state["messages"] and avoids infinite loops by tracking iteration.
04 Understanding LangGraph’s Core Advantages
Loop handling: Unlike linear LangChain chains, LangGraph supports complex loops and conditional branches that match natural multi-turn dialogues.
Built-in state management: AgentState and reducers like add_messages automatically merge new messages instead of overwriting them.
Visualization and debugging: workflow.get_graph().draw_mermaid_png() generates a PNG of the workflow, making logic inspection straightforward.
Flexible node design: Each node can focus on a single responsibility, enabling dedicated tool-calling, condition-checking, or result-validation steps.
Production-ready features: Checkpoint persistence lets an agent resume after interruptions, essential for long-running tasks.
05 From Simple to Complex: Advanced LangGraph Applications
Beyond the introductory examples, LangGraph can power sophisticated systems such as:
Multi‑agent collaboration where specialized agents (search, analysis, reporting) coordinate via a shared graph.
Tool‑calling agents that invoke external services (search engines, databases, APIs) based on tool_calls attributes.
Human‑in‑the‑loop workflows that pause execution with interrupt for manual review in high‑risk domains.
Adaptive Retrieval‑Augmented Generation (RAG) that switches between direct answering and document retrieval depending on query complexity.
Enterprise approval pipelines where each approval step maps to a LangGraph node, providing transparent routing and state tracking.
Conclusion
LangGraph lifts LLM development from simple question‑answering to full‑featured agent systems. By exposing states, nodes, and edges, developers gain fine‑grained control, reliability, and extensibility for building autonomous, tool‑aware, and context‑aware AI applications.