Mastering ChatPromptTemplate: Build Memory‑Enabled Conversational Agents with LangChain
This guide explains how to use LangChain's ChatPromptTemplate and MessagesPlaceholder to construct multi‑turn, memory‑aware chatbots, detailing core message templates, dynamic history insertion, and multiple creation methods with concrete Python code examples.
When building conversational applications such as chatbots, handling multi‑turn dialogue history requires a prompt structure that can insert a dynamic list of messages. ChatPromptTemplate is designed for exactly this purpose and is the standard prompt type to pair with a ChatModel.
Core components of ChatPromptTemplate
A ChatPromptTemplate consists of one or more message templates, each representing a role in the conversation:
SystemMessagePromptTemplate: creates system messages that set the AI's role, background, and behavior guidelines.
HumanMessagePromptTemplate: creates user messages, representing the human input.
AIMessagePromptTemplate: creates AI messages and can include few‑shot examples to steer the model toward a specific answer format.
MessagesPlaceholder: the key to handling dialogue history
In multi‑turn conversations we cannot know in advance how many turns will occur, so a placeholder is needed to indicate where a dynamic message list should be inserted. MessagesPlaceholder defines a variable that, at runtime, is replaced by a list of messages (e.g., the chat history stored in memory).
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.messages import HumanMessage, AIMessage

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="chat_history"),  # placeholder for history
    ("human", "{input}"),  # current user input
])

# At runtime, "chat_history" is replaced with a list of messages
prompt.format_messages(
    chat_history=[
        HumanMessage(content="Hello!"),
        AIMessage(content="Hello! How can I help you?"),
    ],
    input="What is LangChain?",
)

MessagesPlaceholder is therefore the core component for building chatbots with memory.
Creating a ChatPromptTemplate
The most common way to create a ChatPromptTemplate is via the from_messages class method, which accepts several input formats:
Tuple list (type, content):

ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{question}"),
])

List of BaseMessagePromptTemplate objects:

from langchain.prompts import SystemMessagePromptTemplate, HumanMessagePromptTemplate

ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template("..."),
    HumanMessagePromptTemplate.from_template("..."),
])

List of BaseMessage objects:

from langchain_core.messages import SystemMessage, HumanMessage

ChatPromptTemplate.from_messages([
    SystemMessage(content="..."),
    HumanMessage(content="..."),  # concrete messages are inserted as-is, with no variable substitution
])

Note that concrete BaseMessage objects are fixed content; only message templates (tuples or *MessagePromptTemplate objects) have their variables filled in at format time.

In the examples of this chapter we focus on using MessagesPlaceholder to manage dynamic dialogue history, which is a crucial step for building real chat applications.
BirdNest Tech Talk
Author of the rpcx microservice framework, original book author, and chair of Baidu's Go CMC committee.
