Boosting LangChain Agents with MCP: Stdio vs SSE Integration Guide
This article walks through setting up MCP servers in both stdio and SSE modes, shows how to load their tools into LangChain agents using the new create_agent API, compares the two communication methods, and provides complete runnable Python examples.
MCP saw a surge of interest earlier this year, and even after the hype cooled it remains a practical way to extend LangChain agents with access to thousands of online MCP services. LangChain 1.0 introduced a unified create_agent function that replaces older helpers such as create_tool_calling_agent and create_react_agent.
Prerequisites
Install the required packages:
pip install langchain langchain-openai langchain-classic langchain-mcp-adapters mcp

Ensure python3 is available.
Common Components
LLM configuration : Both examples use ChatOpenAI with the deepseek-v3 model and temperature 0. The API key and base URL are set via environment variables, e.g.
export OPENAI_API_KEY=bce-v3/abcsfsfdskgergerthntjrweeuidfu8324refbif3
export OPENAI_API_BASE=https://qianfan.baidubce.com/v2

Agent prompt : A ChatPromptTemplate defines a system message, optional chat history, a human input placeholder, and an agent scratchpad.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant that can use tools. Please use the tools."),
    MessagesPlaceholder("chat_history", optional=True),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),
])

Agent creation and execution : Use create_tool_calling_agent (or later create_agent) together with AgentExecutor.
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)  # verbose=False for SSE

Task : The agent receives the query "What is 123 + 456?".
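Conceptually, the prompt template renders to an ordered list of role-tagged messages. A minimal stdlib sketch of that rendering (render_prompt is a hypothetical helper, not the LangChain API):

```python
def render_prompt(user_input, chat_history=None, scratchpad=None):
    """Assemble the message list the ChatPromptTemplate above produces."""
    messages = [{"role": "system",
                 "content": "You are a helpful assistant that can use tools."}]
    messages.extend(chat_history or [])   # optional chat history slot
    messages.append({"role": "user", "content": user_input})
    messages.extend(scratchpad or [])     # agent scratchpad: tool calls and results
    return messages

msgs = render_prompt("What is 123 + 456?")
```

With no history and an empty scratchpad, only the system and user messages remain, which is exactly the first turn the agent sends to the model.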
MCP Services
Stdio Service
import asyncio
from mcp.server import FastMCP

# Create a server instance
server = FastMCP(name="math_server", log_level="ERROR")

@server.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

async def main():
    await server.run_stdio_async()

if __name__ == "__main__":
    asyncio.run(main())

SSE Service
import asyncio
from typing import Annotated
from mcp.server import FastMCP

server = FastMCP(name="math_server", instructions="A simple math server that can do addition.", log_level="ERROR")

@server.tool()
def add(a: Annotated[int, "the first integer"], b: Annotated[int, "the second integer"]) -> int:
    """Add two integers."""
    return a + b

async def main():
    await server.run_sse_async()

if __name__ == "__main__":
    asyncio.run(main())

Example 1 – Stdio
Purpose: Run a local MCP server that communicates via standard input/output, suitable for single‑process interaction.
Define the stdio MCP server (see Stdio Service code).
Create a stdio_client with StdioServerParameters pointing to the server script.
Load tools from the server using load_mcp_tools.
Build the LangChain agent and invoke it with the math query.
Key code snippet:
mcp_server_path = Path(__file__).parent / "mcp_math_server_stdio.py"
server_params = StdioServerParameters(command="python3", args=[str(mcp_server_path)])

async with stdio_client(server_params) as (read, write):
    async with ClientSession(read, write) as session:
        await session.initialize()
        tools = await load_mcp_tools(session)
        # Agent creation (see Common Components) and execution
        response = await agent_executor.ainvoke({"input": "What is 123 + 456?", "chat_history": []})
        print(f"Agent answer: {response['output']}")

Run with python3 example_1_mcp_tool_stdio.py. The console shows the agent's reasoning steps and final answer.
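Under the hood, the stdio transport exchanges newline-delimited JSON-RPC 2.0 messages over the child process's stdin/stdout. For illustration, this is roughly what a tools/call request for the add tool looks like on the wire (field values sketched from the JSON-RPC 2.0 message shape MCP uses):

```python
import json

# One JSON-RPC request per line on the child's stdin.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 123, "b": 456}},
}
line = json.dumps(request) + "\n"  # newline terminates the message
```

The ClientSession builds and parses these frames for you; the sketch only shows why a plain child process with pipes is enough for the stdio transport.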
Example 2 – SSE
Purpose: Launch an MCP server that exposes tools over HTTP Server‑Sent Events, ideal for network‑distributed tool services.
Start the SSE server as a separate subprocess.
Wait briefly for the server to become ready.
Connect using MultiServerMCPClient with transport "sse" and the endpoint URL.
Retrieve tools, build the agent, and invoke the same math query.
Key code snippet:
client = MultiServerMCPClient({"math": {"transport": "sse", "url": "http://localhost:8000/sse"}})
tools = await client.get_tools()

# Agent creation (same as above) and execution
response = await agent_executor.ainvoke({"input": "What is 123 + 456?", "chat_history": []})
print(f"Agent answer: {response['output']}")

# Clean up
server_process.terminate()
await server_process.wait()

Run with python3 example_2_mcp_tool_sse.py. The verbose flag is set to False to suppress excessive SSE subprocess output.
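The "wait briefly for the server to become ready" step is more robust as a port poll than a fixed sleep. A stdlib sketch (host and port are the assumptions from this example; wait_for_port is a helper introduced here, not part of any MCP library):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 10.0) -> bool:
    """Poll until a TCP port accepts connections, or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # server is accepting connections
        except OSError:
            time.sleep(0.2)  # not up yet; retry shortly
    return False
```

Calling wait_for_port("localhost", 8000) after spawning the subprocess, and aborting if it returns False, avoids both racing a slow startup and sleeping longer than necessary.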
Comparison – Stdio vs SSE
Stdio
Simple to set up; best for tightly‑coupled local components.
Server runs as a child process of the client.
Use case: embedding a tool server directly inside an application.
SSE
More flexible; server can run on a separate machine.
Scalable as part of a micro‑service architecture.
Use case: shared tool services accessed by multiple clients over the web.
Both methods enable LangChain agents to discover and use MCP‑provided tools, offering deployment flexibility.
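The two transports can also be mixed in a single client: MultiServerMCPClient takes a mapping from server name to per-server transport configuration. A sketch of such a mapping (server names and the script path are illustrative):

```python
from pathlib import Path

# Illustrative configuration mixing both transports in one client.
# Keys are arbitrary server names; each value selects that server's transport.
servers = {
    "math_local": {   # stdio: the client spawns the server as a child process
        "transport": "stdio",
        "command": "python3",
        "args": [str(Path("mcp_math_server_stdio.py"))],
    },
    "math_remote": {  # SSE: the client connects to an already-running endpoint
        "transport": "sse",
        "url": "http://localhost:8000/sse",
    },
}
# client = MultiServerMCPClient(servers)  # requires langchain-mcp-adapters
```

client.get_tools() would then return the tools of both servers in one list, so the agent does not need to know which transport each tool came from.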
Using the New create_agent API (LangChain 1.0)
The updated create_agent function accepts an LLM, a list of tools, and a system prompt, returning a graph object that can stream responses.
Example 3 – MCP SSE with create_agent
Starts the SSE server, fetches tools via MultiServerMCPClient, configures ChatOpenAI, and builds the agent:
graph = create_agent(
    model=llm,
    tools=tools,
    system_prompt="You are a helpful assistant that can use tools. Please use the tools."
)

async for chunk in graph.astream({"messages": [{"role": "user", "content": "What is 123 + 456?"}]}, stream_mode="updates"):
    print(chunk)

Example 4 – MCP Stdio with create_agent
Uses a stdio client to connect to the local server, loads tools with load_mcp_tools, and runs the same streaming agent logic as Example 3.
# Same graph creation as above, but tools come from stdio_client
# Streaming loop identical to Example 3

These four examples demonstrate how to integrate MCP tools into LangChain agents via both stdio and SSE, and how the newer create_agent API simplifies agent definition while supporting both communication modes.
For the full source code and additional examples, see the LangChain development repository: https://github.com/your-repo/langchain-mcp-examples
BirdNest Tech Talk
Author of the rpcx microservice framework, original book author, and chair of Baidu's Go CMC committee.
