LangGraph AI Agent Tutorial Part 9: Managing Short‑ and Long‑Term Memory
This tutorial explains how LangGraph handles short‑term memory with InMemorySaver and thread‑isolated checkpoints, how to persist it using PostgresSaver, and how to build long‑term memory with InMemoryStore and PostgresStore, including semantic search and custom storage extensions.
Short‑Term Memory
LangGraph stores the messages list in a State object. The default InMemorySaver keeps this data in memory and isolates it per thread. Creating a new conversation starts a new thread, so previous messages are not visible to the new thread.
from langchain.chat_models import init_chat_model
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.prebuilt import create_react_agent
checkpointer = InMemorySaver()
model = init_chat_model(model="deepseek-chat", model_provider="deepseek", api_key="YOUR_DEEPSEEK_API_KEY")
agent = create_react_agent(model=model, tools=[], checkpointer=checkpointer)
When invoking the agent, the configurable.thread_id selects the thread (and thus the short‑term memory) to use:
config = {"configurable": {"thread_id": "1"}}
response = agent.invoke({"messages": [{"role": "user", "content": "Hello, my name is Teacher Cang"}]}, config)
print(response['messages'][-1].content)
new_config = {"configurable": {"thread_id": "2"}}
response = agent.invoke({"messages": [{"role": "user", "content": "Do you still remember my name?"}]}, new_config)
print(response['messages'][-1].content)
Running the code shows that thread 1 remembers the name while thread 2 does not.
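The isolation above comes from the checkpointer keying saved state by thread_id. The following toy class is not LangGraph's actual InMemorySaver, only a minimal sketch of why a new thread starts with no history:

```python
# Toy illustration of per-thread checkpoint isolation (NOT the real
# InMemorySaver): state is stored and looked up purely by thread_id.

class ToyCheckpointer:
    def __init__(self):
        self._checkpoints = {}  # thread_id -> list of messages

    def save(self, thread_id, messages):
        self._checkpoints[thread_id] = list(messages)

    def load(self, thread_id):
        # An unknown thread_id starts with an empty history.
        return self._checkpoints.get(thread_id, [])

cp = ToyCheckpointer()
cp.save("1", [{"role": "user", "content": "Hello, my name is Teacher Cang"}])

print(len(cp.load("1")))  # → 1: thread 1 sees its own history
print(len(cp.load("2")))  # → 0: thread 2 starts empty
```

The real saver checkpoints the full graph state after each step, but the lookup-by-thread_id principle is the same.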
Persisting Short‑Term Memory with Postgres
To retain memory after the process exits, use a checkpoint saver backed by a database, e.g., Postgres:
pip install -U "psycopg[binary,pool]" langgraph-checkpoint-postgres
from langgraph.checkpoint.postgres import PostgresSaver
DB_URI = "postgresql://postgres:postgres@localhost:5442/postgres?sslmode=disable"
with PostgresSaver.from_conn_string(DB_URI) as checkpointer:
    checkpointer.setup()
    # reuse the same create_react_agent flow, passing checkpointer
Long‑Term Memory
Long‑term memory survives across threads and sessions. LangGraph provides InMemoryStore for simple use‑cases and PostgresStore for durable storage.
InMemoryStore API
Data is stored under a namespace (a tuple) and a key. Example of put:
from langgraph.store.memory import InMemoryStore
store = InMemoryStore()
store.put(("users",), "user_123", {"name": "John Smith", "language": "English"})
Retrieving with get:
user_info = store.get(("users",), "user_123")
print(user_info.value)
Semantic search (requires an embedding model):
from langchain.embeddings import init_embeddings
from langgraph.store.memory import InMemoryStore
embeddings = init_embeddings("bge-m3:latest", provider='ollama')
store = InMemoryStore(index={"embed": embeddings, "dims": 1024})
store.put(("user_123", "memories"), "1", {"text": "I love eating hamburgers"})
store.put(("user_123", "memories"), "2", {"text": "I am Cang Jinkong"})
results = store.search(("user_123", "memories"), query="I'm hungry", limit=1)
print(results)
Persisting Long‑Term Memory with PostgresStore
Combine PostgresStore with PostgresSaver for production‑grade persistence:
import uuid

from langchain.chat_models import init_chat_model
from langchain_core.runnables import RunnableConfig
from langgraph.graph import StateGraph, MessagesState, START
from langgraph.store.base import BaseStore
from langgraph.store.postgres import PostgresStore
from langgraph.checkpoint.postgres import PostgresSaver

DB_URI = "postgresql://postgres:postgres@localhost:5442/postgres?sslmode=disable"

model = init_chat_model(model="deepseek-chat", model_provider="deepseek", api_key="YOUR_DEEPSEEK_API_KEY")

with (
    PostgresStore.from_conn_string(DB_URI) as store,
    PostgresSaver.from_conn_string(DB_URI) as checkpointer,
):
    store.setup()
    checkpointer.setup()

    def call_model(state: MessagesState, config: RunnableConfig, *, store: BaseStore):
        user_id = config["configurable"]["user_id"]
        namespace = ("memories", user_id)
        memories = store.search(namespace, query=str(state["messages"][-1].content))
        info = "\n".join([d.value["data"] for d in memories])
        system_msg = f"You are a helpful assistant. User info: {info}"
        if "remember" in state["messages"][-1].content.lower():
            store.put(namespace, str(uuid.uuid4()), {"data": "The user's name is Teacher Cang"})
        response = model.invoke([{"role": "system", "content": system_msg}] + state["messages"])
        return {"messages": response}

    builder = StateGraph(MessagesState)
    builder.add_node(call_model)
    builder.add_edge(START, "call_model")
    graph = builder.compile(checkpointer=checkpointer, store=store)

    # First conversation – store the name
    cfg1 = {"configurable": {"thread_id": "1", "user_id": "1"}}
    resp1 = graph.invoke({"messages": [{"role": "user", "content": "Hello, remember: my name is Teacher Cang"}]}, cfg1)
    print(resp1['messages'][-1])

    # Second conversation – different thread, same user_id
    cfg2 = {"configurable": {"thread_id": "2", "user_id": "1"}}
    resp2 = graph.invoke({"messages": [{"role": "user", "content": "What is my name?"}]}, cfg2)
    print(resp2['messages'][-1])
Workflow:
The first turn detects the keyword "remember" (记住) and stores a memory recording that the user's name is Teacher Cang in the Postgres database.
The second turn, even with a different thread ID, retrieves the stored name from long‑term memory and replies accordingly.
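This cross-thread behaviour can be sketched without any database. In the toy model below (not the real PostgresStore), memories are keyed by a (namespace, key) pair, and the user_id-based namespace, not the thread_id, decides what a lookup sees:

```python
import uuid

# Toy model of cross-thread long-term memory: entries are keyed by a
# (namespace, key) pair, and the namespace is derived from user_id,
# so a second thread for the same user still finds them.
memory_store = {}

def put(namespace, key, value):
    memory_store[(namespace, key)] = value

def search(namespace):
    # The real store ranks entries by semantic similarity to a query;
    # here we simply return everything stored under the namespace.
    return [v["data"] for (ns, _), v in memory_store.items() if ns == namespace]

# Turn 1 (thread "1"): the keyword triggers a long-term write.
user_msg = "Hello, remember: my name is Teacher Cang"
if "remember" in user_msg.lower():
    put(("memories", "1"), str(uuid.uuid4()), {"data": "The user's name is Teacher Cang"})

# Turn 2 (thread "2", same user_id "1"): the lookup still succeeds.
print(search(("memories", "1")))  # → ['The user's name is Teacher Cang']
```

The key design point is that short-term memory is scoped by thread_id while long-term memory is scoped by the namespace you choose, here ("memories", user_id).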
Advanced Customisation
LangGraph’s checkpoint and store abstractions (BaseCheckpointSaver, BaseStore) can be subclassed to integrate custom back‑ends such as SQLite, MongoDB, or proprietary databases. Implementations must follow the interfaces defined in BaseCheckpointSaver and BaseStore (see the LangGraph reference).
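As a rough illustration of the surface such a back-end must provide, the standalone class below mirrors the namespace/key put, get, delete, and search operations used throughout this tutorial. It is a dict-backed sketch, not a real BaseStore subclass; a production implementation would subclass LangGraph's base class and persist to SQLite, MongoDB, etc.:

```python
# Standalone sketch of a custom store back-end (NOT a real LangGraph
# subclass): a dict stands in for the database, and the method names
# mirror the namespace/key operations shown earlier.

class DictBackedStore:
    def __init__(self):
        self._data = {}  # (namespace, key) -> value

    def put(self, namespace, key, value):
        self._data[(namespace, key)] = value

    def get(self, namespace, key):
        return self._data.get((namespace, key))

    def delete(self, namespace, key):
        self._data.pop((namespace, key), None)

    def search(self, namespace):
        # A production store would rank results by embedding similarity;
        # this sketch returns every value under the namespace.
        return [v for (ns, _), v in self._data.items() if ns == namespace]

store = DictBackedStore()
store.put(("users",), "user_123", {"name": "John Smith"})
print(store.get(("users",), "user_123"))  # → {'name': 'John Smith'}
```

Swapping the dict for real database calls, plus the async variants the interface defines, is the bulk of the remaining work.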
Conclusion
This tutorial demonstrated short‑term memory using InMemorySaver (in‑memory) and PostgresSaver (persistent), and long‑term memory using InMemoryStore and PostgresStore, including semantic search and custom storage extensions. By combining these mechanisms, developers can build agents that retain context across conversations and sessions.
Fun with Large Models
Master's graduate from Beijing Institute of Technology, published four top‑journal papers, previously worked as a developer at ByteDance and Alibaba. Currently researching large models at a major state‑owned enterprise. Committed to sharing concise, practical AI large‑model development experience, believing that AI large models will become as essential as PCs in the future. Let's start experimenting now!