8 Leading LLM Agent Frameworks and How to Plug In an MCP Server
This article surveys eight popular large‑language‑model (LLM) agent development frameworks—OpenAI Agents SDK, LangGraph, LlamaIndex, AutoGen, Pydantic AI, SmolAgents, Camel, and CrewAI—explaining each one's key features and providing concrete Python code for integrating an MCP server for tool access.
The rapid growth of large‑language‑model (LLM) agents has produced many development frameworks. This guide reviews eight mainstream LLM agent frameworks and shows how to connect each one to an MCP Server, enabling agents to call external tools such as web search.
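Under the hood, every integration below speaks the same protocol: the client launches (or connects to) an MCP server and exchanges JSON-RPC 2.0 messages such as tools/list and tools/call. A minimal sketch of the request a client sends when an agent invokes a search tool (the "search" tool name and its arguments are illustrative, not the actual Tavily tool definition):

```python
import json

# JSON-RPC 2.0 request an MCP client sends to invoke a server-side tool.
# The "search" tool name and its arguments are illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"query": "Has Llama 4.0 been released?"},
    },
}
wire = json.dumps(request)
print(wire)
```

Every framework in this article ultimately serializes tool invocations into messages of this shape; what differs is only the Python API wrapped around them.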
1. OpenAI Agents SDK
Framework Overview: A lightweight SDK released by OpenAI, derived from the internal Swarm project. It focuses on simplicity, handoffs, and guardrails.
MCP Integration:

```python
import asyncio, os

from agents import Agent, Runner, RunConfig
from agents.mcp import MCPServerStdio

async def main():
    # 1. Create the MCP server instance (Tavily web search over stdio)
    search_server = MCPServerStdio(
        params={
            "command": "npx",
            "args": ["-y", "@mcptools/mcp-tavily"],
            "env": {**os.environ},
        }
    )
    await search_server.connect()

    # 2. Create the agent and attach the MCP server
    agent = Agent(
        name="Assistant Agent",
        instructions="You are an assistant with web search capability; use the search tool to look up information when needed.",
        mcp_servers=[search_server],
    )

    # 3. Run the agent – it decides when to call the search tool
    result = await Runner.run(
        agent,
        "Has Llama 4.0 been released?",
        run_config=RunConfig(tracing_disabled=True),
    )
    print(result.final_output)
    await search_server.cleanup()

if __name__ == "__main__":
    asyncio.run(main())
```

The SDK also supports automatic tool-list caching via cache_tools_list=True and manual cache invalidation with invalidate_tools_cache().
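All of the examples in this article pass {**os.environ} to the spawned server process because the Tavily MCP server reads its API key from the environment. Assuming the @mcptools/mcp-tavily package follows the usual Tavily convention of a TAVILY_API_KEY variable (the key value below is a placeholder), the setup looks like:

```shell
# Export the Tavily API key (placeholder value) so the MCP server
# spawned via npx can authenticate its search requests.
export TAVILY_API_KEY="tvly-your-key-here"
npx -y @mcptools/mcp-tavily
```

In the framework examples this manual npx invocation is unnecessary; the MCP client spawns the process itself, but it still inherits the exported variable.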
2. LangGraph
Framework Overview: Built on LangChain, LangGraph models agentic workflows as stateful graphs, allowing complex, structured interactions.
MCP Integration:

```python
import asyncio, os

from dotenv import load_dotenv
from langchain_core.messages import SystemMessage, HumanMessage
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

load_dotenv()
model = ChatOpenAI(model="gpt-4o-mini")

async def run_agent():
    # Connect to the Tavily MCP server over stdio and expose its tools
    async with MultiServerMCPClient({
        "tavily": {
            "command": "npx",
            "args": ["-y", "@mcptools/mcp-tavily"],
            "env": {**os.environ},
        }
    }) as client:
        agent = create_react_agent(model, client.get_tools())
        system_message = SystemMessage(
            content="You are an assistant with web search capability; use the search tool to look up information when needed."
        )
        response = await agent.ainvoke(
            {"messages": [system_message, HumanMessage(content="Has Llama 4.0 been released?")]}
        )
        return response["messages"][-1].content

if __name__ == "__main__":
    print("Final answer:", asyncio.run(run_agent()))
```

3. LlamaIndex
Framework Overview: Initially focused on data‑centric RAG applications, LlamaIndex now offers full‑stack agent capabilities via Workflows and AgentWorkflow.
MCP Integration:

```python
import asyncio, os

from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import McpToolSpec, BasicMCPClient

llm = OpenAI(model="gpt-4o-mini")

async def main():
    # Spawn the Tavily MCP server over stdio
    mcp_client = BasicMCPClient(
        "npx",
        ["-y", "@mcptools/mcp-tavily"],
        env={**os.environ},
    )
    mcp_tool = McpToolSpec(client=mcp_client)
    tools = await mcp_tool.to_tool_list_async()
    agent = ReActAgent.from_tools(
        tools,
        llm=llm,
        verbose=True,
        system_prompt="You are an assistant with web search capability; use the search tool to look up information when needed.",
    )
    response = await agent.aquery("Has Llama 4.0 been released?")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())
```

For remote SSE‑based MCP servers, replace the command/args with a URL when constructing BasicMCPClient.
4. AutoGen 0.4+
Framework Overview: Microsoft's AutoGen enables multi‑agent collaboration, with a new Core API layer for fine‑grained control. It is powerful but complex.
MCP Integration (simplified example):
```python
import os

from autogen_ext.tools.mcp import StdioServerParams, mcp_server_tools

async def get_mcp_tools():
    # Spawn the Tavily MCP server over stdio and wrap its tools for AutoGen
    server_params = StdioServerParams(
        command="npx",
        args=["-y", "@mcptools/mcp-tavily"],
        env={**os.environ},
    )
    tools = await mcp_server_tools(server_params)
    return tools

# Further agent setup would pass these tools to AutoGen's agent APIs.
```

Use SseServerParams with a URL for remote MCP servers.
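Whatever the framework, helpers like mcp_server_tools ultimately issue a tools/list request to the server and convert the returned tool definitions into native tool objects. The result has roughly this shape (the "search" entry and its schema are illustrative, not the actual Tavily tool definition):

```python
# Approximate shape of an MCP tools/list result; the tool entry is illustrative.
tools_list_result = {
    "tools": [
        {
            "name": "search",
            "description": "Run a web search and return matching pages.",
            "inputSchema": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        }
    ]
}

# Frameworks iterate over these entries to build their own tool wrappers.
tool_names = [tool["name"] for tool in tools_list_result["tools"]]
print(tool_names)
```

Each tool carries a JSON Schema (inputSchema) describing its arguments, which is what lets frameworks generate typed tool wrappers automatically.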
5. Pydantic AI
Framework Overview: Built by the creators of Pydantic, this framework leverages strong type validation and structured output for LLM agents.
MCP Integration:

```python
import asyncio, os

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

# Tavily MCP server spawned over stdio
server = MCPServerStdio(
    "npx",
    ["-y", "@mcptools/mcp-tavily"],
    env={**os.environ},
)

agent = Agent(
    name="Assistant Agent",
    system_prompt="You are an assistant with web search capability; use the search tool to look up information when needed.",
    model="openai:gpt-4o-mini",
    mcp_servers=[server],
)

async def main():
    # run_mcp_servers() starts the stdio server for the duration of the block
    async with agent.run_mcp_servers():
        result = await agent.run("Has Llama 4.0 been released?")
        print(result.data)

if __name__ == "__main__":
    asyncio.run(main())
```

Switch to MCPServerHTTP for SSE‑based remote servers.
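The stdio-versus-SSE distinction recurs across all of these frameworks: a command plus arguments means a locally spawned stdio server, while a URL means a remote SSE/HTTP server. A toy helper capturing that rule of thumb (illustrative only, not part of any framework's API):

```python
def pick_transport(target: str) -> str:
    """Illustrative heuristic: URLs imply an SSE/HTTP MCP server,
    anything else is treated as a stdio command to spawn locally."""
    return "sse" if target.startswith(("http://", "https://")) else "stdio"

print(pick_transport("https://example.com/sse"))  # sse
print(pick_transport("npx"))                      # stdio
```

The frameworks encode the same split in their class names: MCPServerStdio vs MCPServerHTTP here, StdioServerParams vs SseServerParams in AutoGen, and so on.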
6. SmolAgents
Framework Overview: A lightweight Hugging Face project that uses code‑generated tool calls (CodeAgent) and integrates tightly with the HF ecosystem.
MCP Integration:

```python
import os

from smolagents import ToolCollection, ToolCallingAgent, LiteLLMModel
from mcp import StdioServerParameters

model = LiteLLMModel(model_id="gpt-4o-mini")

server_parameters = StdioServerParameters(
    command="npx",
    args=["-y", "@mcptools/mcp-tavily"],
    env={**os.environ},
)

# Load the MCP server's tools as a smolagents ToolCollection
with ToolCollection.from_mcp(server_parameters, trust_remote_code=True) as tool_collection:
    agent = ToolCallingAgent(tools=[*tool_collection.tools], model=model)
    response = agent.run("Has Llama 4.0 been released?")
    print(response)
```

Replace the command/args with a URL for SSE mode.
7. Camel
Framework Overview: Provides role‑playing multi‑agent collaboration and includes components for building RAG applications.
MCP Integration:

```python
import asyncio, os

from camel.agents import ChatAgent
from camel.toolkits.mcp_toolkit import MCPToolkit, MCPClient

async def run_example():
    # Spawn the Tavily MCP server over stdio
    mcp_client = MCPClient(
        command_or_url="npx",
        args=["-y", "@mcptools/mcp-tavily"],
        env={**os.environ},
    )
    await mcp_client.connect()
    mcp_toolkit = MCPToolkit(servers=[mcp_client])
    tools = mcp_toolkit.get_tools()
    try:
        agent = ChatAgent(
            system_message="Use the web search tool to look up information as the task requires.",
            tools=tools,
        )
        response = await agent.astep("Has Llama 4.0 been released?")
        print("Response:", response.msgs[0].content)
    finally:
        await mcp_client.disconnect()

if __name__ == "__main__":
    asyncio.run(run_example())
```

Use a URL in MCPClient for remote SSE servers.
8. CrewAI
Framework Overview: Enables teams of agents with distinct roles to cooperate on complex tasks; the recent Flow feature improves workflow reliability.
MCP Integration (via third‑party adapter):
```python
import os

from crewai import Agent, Crew, Task
from mcp import StdioServerParameters
from mcpadapt.core import MCPAdapt
from mcpadapt.crewai_adapter import CrewAIAdapter

# Adapt the Tavily MCP server's tools into CrewAI-compatible tools
with MCPAdapt(
    StdioServerParameters(
        command="npx",
        args=["-y", "@mcptools/mcp-tavily"],
        env={**os.environ},
    ),
    CrewAIAdapter(),
) as tools:
    agent = Agent(
        role="MyAgent",
        goal="Use the web search tool to look up information as the task requires.",
        backstory="You are a Chinese-language search assistant.",
        tools=tools,
        llm="gpt-4o-mini",
    )
    task = Task(
        description="The latest news about Llama 4.0",
        agent=agent,
        expected_output="A list of news items",
    )
    crew = Crew(agents=[agent], tasks=[task])
    crew.kickoff()
```

The official MCP adapter for CrewAI is still under development; see the related GitHub PR #2496 for progress.
All eight frameworks support MCP integration, though the exact API differs. Developers should consult the latest documentation of each framework for updates.
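Despite the differing APIs, every integration above reduces to the same three values handed to an MCP client: a command, its arguments, and an environment. A framework-agnostic sketch of that shared configuration shape (the class name is illustrative, not any specific library's):

```python
from dataclasses import dataclass, field

@dataclass
class McpStdioConfig:
    """The three values every framework above passes to its MCP client
    when spawning a stdio-based server."""
    command: str
    args: list = field(default_factory=list)
    env: dict = field(default_factory=dict)

config = McpStdioConfig(command="npx", args=["-y", "@mcptools/mcp-tavily"])
print(config.command, config.args)
```

Once you recognize this shape, porting an MCP integration from one framework to another is mostly a matter of renaming the parameter class and the attachment point on the agent.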
AI Large Model Application Practice
Focused on deep research and development of large-model applications. Authors of "RAG Application Development and Optimization Based on Large Models" and "MCP Principles Unveiled and Development Guide". Primarily B2B, with B2C as a supplement.
