How MCP Turns AI Models into a Universal USB Interface
MCP (Model Context Protocol) is an open standard released by Anthropic that standardises how AI models interact with external tools, databases, and services through a USB-like interface. This article dissects its design goals, architecture, message types, Python SDK implementation, client integration, production best practices, and future roadmap.
Why MCP Is Needed
Core Challenges of AI Applications
Current AI systems are isolated; they can only use the knowledge baked into the model at training time and cannot directly interact with the external world. Retrieval‑augmented generation (RAG) solves knowledge acquisition, but tool invocation, data writing, and multi‑system coordination still lack a unified standard.
Typical scenarios include:
AI reading real‑time data from a database
AI calling external APIs to perform specific actions
AI writing data to multiple external systems
AI subscribing to real‑time event streams
Each scenario currently requires custom tool‑call formats, bespoke API adapters, and ad‑hoc error handling, leading to duplicated effort.
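To make the duplication concrete, here is a hedged sketch of the same hypothetical read_file tool declared twice, once per vendor envelope. The field names follow the publicly documented OpenAI function-calling and Anthropic tool-use formats; the tool itself is illustrative:

```python
# One JSON Schema for the tool's parameters, shared by both declarations.
json_schema = {
    "type": "object",
    "properties": {"path": {"type": "string"}},
    "required": ["path"],
}

openai_style = {  # OpenAI function-calling envelope
    "type": "function",
    "function": {"name": "read_file", "description": "Read a file", "parameters": json_schema},
}

anthropic_style = {  # Anthropic tool-use envelope
    "name": "read_file",
    "description": "Read a file",
    "input_schema": json_schema,
}

# Identical capability, incompatible envelopes; every integration gets rewritten per provider.
assert openai_style["function"]["parameters"] == anthropic_style["input_schema"]
```

The schemas carry the same information, yet neither envelope is usable with the other provider, which is exactly the per-system adapter work MCP removes.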
Core Value of MCP
MCP positions itself as the "USB interface for the AI world," offering a standardized way for any AI model to interact with any external system.
The USB interface solves:
- No need to design a dedicated interface for each device
- Plug‑and‑play usage
- Standardisation enables ecosystem growth
MCP solves:
- No need to design a dedicated adapter for each external system
- One tool integration works for all AI models
- Standardisation fosters a flourishing ecosystem
MCP Protocol Architecture
Core Components
┌─────────────────────────────────────────────────────────────┐
│ AI Application │
│ ┌─────────────────────────────────────────────────────┐ │
│ │ MCP Client │ │
│ │ - Maintains connection to Server │ │
│ │ - Serialises / deserialises messages │ │
│ │ - Handles protocol handshake │ │
│ └─────────────────────────────────────────────────────┘ │
└──────────────────────────┬──────────────────────────────────┘
│ stdio / HTTP + SSE
┌──────────────────────────▼──────────────────────────────────┐
│ MCP Server (process) │
│ ┌─────────────────────────────────────────────────────┐ │
│ │ Protocol Engine │ │
│ │ - Message routing │ │
│ │ - Capability negotiation │ │
│ │ - Error handling │ │
│ └─────────────────────────────────────────────────────┘ │
│ ┌─────────────────────────────────────────────────────┐ │
│ │ Tool Handler │ │
│ │ - Exposes tool list │ │
│ │ - Executes tool calls │ │
│ │ - Formats results │ │
│ └─────────────────────────────────────────────────────┘ │
│ ┌─────────────────────────────────────────────────────┐ │
│ │ Resource Handler │ │
│ │ - Manages readable resources │ │
│ │ - Handles resource subscriptions │ │
│ │ - Caches content │ │
│ └─────────────────────────────────────────────────────┘ │
└──────────────────────────┬──────────────────────────────────┘
│
┌──────────────────────────▼──────────────────────────────────┐
│ External Systems │
│ - REST APIs - Databases - File Systems - etc. │
└─────────────────────────────────────────────────────────────┘
Communication Protocols
MCP supports two transport modes.
stdio mode (local/CLI tools)
AI App → MCP Client → stdin → MCP Server → External System
↑ ↓
└────── stdout ←───────┘
HTTP + SSE mode (remote services)
AI App → MCP Client ──── HTTP POST /message ──→ MCP Server
↑ ↓
└──────── SSE (Server‑Sent Events) ←──────────┘
Message Types
MCP defines four core JSON‑RPC messages.
// 1. Initialize – handshake
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {"roots": {}, "tools": {}},
    "clientInfo": {"name": "claude-desktop", "version": "1.0"}
  }
}
// 2. Tools/List – list available tools
{ "jsonrpc": "2.0", "id": 2, "method": "tools/list" }
// 3. Tools/Call – invoke a tool
{ "jsonrpc": "2.0", "id": 3, "method": "tools/call", "params": {"name": "read_file", "arguments": {"path": "/etc/passwd"}} }
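A successful tools/call is answered with a JSON-RPC result whose content array carries typed items; the envelope shape below follows the MCP specification, while the file text itself is illustrative:

```python
# Illustrative server reply to the tools/call request above.
# The result envelope ("content" list of typed items plus "isError")
# follows the MCP spec's CallToolResult shape.
response = {
    "jsonrpc": "2.0",
    "id": 3,  # matches the request id
    "result": {
        "content": [
            {"type": "text", "text": "root:x:0:0:root:/root:/bin/bash\n..."}
        ],
        "isError": False
    }
}

assert response["id"] == 3
assert response["result"]["content"][0]["type"] == "text"
```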
// 4. Resources – list resources
{ "jsonrpc": "2.0", "id": 4, "method": "resources/list" }
MCP Server Development
Python SDK Implementation
from mcp.server.fastmcp import FastMCP

# Initialise MCP Server
mcp = FastMCP("filesystem-server")

@mcp.tool()
def read_file(path: str, encoding: str = "utf-8") -> str:
    """Read file content"""
    with open(path, 'r', encoding=encoding) as f:
        return f.read()

@mcp.tool()
def write_file(path: str, content: str) -> dict:
    """Write file content"""
    with open(path, 'w', encoding='utf-8') as f:
        f.write(content)
    return {"success": True, "path": path, "bytes": len(content)}

@mcp.tool()
def list_directory(path: str) -> list[dict]:
    """List directory contents"""
    import os
    items = []
    for item in os.listdir(path):
        full_path = os.path.join(path, item)
        items.append({
            "name": item,
            "type": "dir" if os.path.isdir(full_path) else "file",
            "size": os.path.getsize(full_path) if os.path.isfile(full_path) else None
        })
    return items

@mcp.resource("file://{path}")
def file_resource(path: str) -> str:
    """Dynamic resource access"""
    with open(path, 'r') as f:
        return f.read()

if __name__ == "__main__":
    mcp.run()
Tools with Complex Parameters
@mcp.tool()
def search_codebase(
    query: str,
    file_pattern: str = "*.py",
    case_sensitive: bool = False,
    max_results: int = 50
) -> list[dict]:
    """Search code in a codebase"""
    import glob, os
    results = []
    # The "**/" prefix is required for recursive=True to descend into subdirectories
    for file_path in glob.glob(f"**/{file_pattern}", recursive=True):
        if not os.path.isfile(file_path):
            continue
        try:
            with open(file_path, 'r', encoding='utf-8', errors='ignore') as f:
                lines = f.readlines()
            for i, line in enumerate(lines):
                match = query in line if case_sensitive else query.lower() in line.lower()
                if match:
                    results.append({
                        "file": file_path,
                        "line": i + 1,
                        "content": line.strip(),
                        "context": lines[max(0, i-2):i] + lines[i+1:min(len(lines), i+3)]
                    })
                    if len(results) >= max_results:
                        return results
        except Exception:
            continue
    return results
Streaming Response Tool
from typing import AsyncIterator

@mcp.tool()
async def stream_log_tail(path: str, lines: int = 100) -> AsyncIterator[dict]:
    """Stream the last N lines of a log file"""
    import asyncio
    with open(path, 'r') as f:
        all_lines = f.readlines()
    last_lines = all_lines[-lines:]
    for line in last_lines:
        yield {"content": line.strip()}
        await asyncio.sleep(0.1)  # simulate real-time push
MCP Client Integration
Python Client Implementation
from contextlib import AsyncExitStack
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
import asyncio

class MCPClient:
    def __init__(self, server_command: list[str]):
        self.server_command = server_command
        self.exit_stack = AsyncExitStack()
        self.session: ClientSession | None = None

    async def initialize(self):
        server_params = StdioServerParameters(
            command=self.server_command[0],
            args=self.server_command[1:]
        )
        # Enter the transport and session via an exit stack so they stay
        # open after this method returns (an `async with` here would close
        # them immediately and invalidate the session)
        read, write = await self.exit_stack.enter_async_context(stdio_client(server_params))
        self.session = await self.exit_stack.enter_async_context(ClientSession(read, write))
        init_result = await self.session.initialize()
        print(f"Server capabilities: {init_result.capabilities}")

    async def close(self):
        await self.exit_stack.aclose()

    async def list_tools(self) -> list[dict]:
        response = await self.session.list_tools()
        return [tool.model_dump() for tool in response.tools]

    async def call_tool(self, tool_name: str, arguments: dict) -> dict:
        result = await self.session.call_tool(tool_name, arguments)
        return result.content[0].model_dump()

async def main():
    client = MCPClient(["python", "filesystem_server.py"])
    await client.initialize()
    try:
        tools = await client.list_tools()
        print("Available tools:", [t["name"] for t in tools])
        result = await client.call_tool("read_file", {"path": "/tmp/test.txt"})
        print("File content:", result["text"])
    finally:
        await client.close()

asyncio.run(main())
LangChain Integration
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

async def main():
    # Initialise MCP client with multiple stdio servers
    client = MultiServerMCPClient({
        "filesystem": {"command": "python", "args": ["/path/to/filesystem_server.py"], "transport": "stdio"},
        "database": {"command": "python", "args": ["/path/to/database_server.py"], "transport": "stdio"}
    })
    # get_tools() is a coroutine in recent langchain-mcp-adapters releases
    tools = await client.get_tools()
    llm = ChatOpenAI(model="gpt-4o")
    agent = create_react_agent(llm, tools)
    result = await agent.ainvoke({"messages": [{"role": "user", "content": "Read /tmp/config.yaml and summarise key settings"}]})

asyncio.run(main())
Production Best Practices
Server Health Checks
import asyncio
from contextlib import AsyncExitStack
from mcp import ClientSession
from mcp.client.stdio import stdio_client

class MCPClientWithHealthCheck:
    def __init__(self, server_params, health_check_timeout: float = 5.0):
        self.server_params = server_params
        self.health_check_timeout = health_check_timeout
        self._session = None
        self._stack = None

    async def health_check(self) -> bool:
        """Spin up a short-lived session and verify the server responds."""
        try:
            async with stdio_client(self.server_params) as (read, write):
                async with ClientSession(read, write) as session:
                    await asyncio.wait_for(session.initialize(), timeout=self.health_check_timeout)
                    await session.list_tools()
                    return True
        except asyncio.TimeoutError:
            return False
        except Exception as e:
            print(f"Health check failed: {e}")
            return False

    async def get_session(self) -> ClientSession:
        """Lazily open and cache a long-lived session via an exit stack,
        so the transport outlives this method call."""
        if self._session is None:
            self._stack = AsyncExitStack()
            read, write = await self._stack.enter_async_context(stdio_client(self.server_params))
            self._session = await self._stack.enter_async_context(ClientSession(read, write))
            await self._session.initialize()
        return self._session
Error Handling and Retries
import asyncio
from mcp import ClientSession
from mcp.client.stdio import stdio_client
from mcp.shared.exceptions import McpError

class ResilientMCPClient:
    def __init__(self, server_params, max_retries: int = 3):
        self.server_params = server_params
        self.max_retries = max_retries

    async def call_with_retry(self, tool_name: str, arguments: dict) -> dict:
        last_error = None
        for attempt in range(self.max_retries):
            try:
                async with stdio_client(self.server_params) as (read, write):
                    async with ClientSession(read, write) as session:
                        await session.initialize()
                        result = await session.call_tool(tool_name, arguments)
                        return result.content[0].model_dump()
            except McpError as e:
                # Protocol-level errors are not transient; fail fast
                raise RuntimeError(f"MCP protocol error: {e}")
            except Exception as e:
                last_error = e
                if attempt < self.max_retries - 1:
                    await asyncio.sleep(2 ** attempt)  # exponential back-off
                    continue
        raise RuntimeError(f"Failed after {self.max_retries} attempts: {last_error}")
Security Considerations
@mcp.tool()
def delete_file(path: str) -> dict:
    """Delete a file with strict safety checks"""
    import os
    # Reject traversal before normalisation, since abspath resolves ".." away
    if ".." in path:
        raise ValueError("Path traversal not allowed")
    abs_path = os.path.abspath(path)
    if abs_path == "/":
        raise ValueError("Cannot delete the filesystem root")
    # Note: listing "/" here would match every absolute path, so system
    # directories are checked individually with a separator-aware prefix test
    forbidden_paths = ["/etc", "/usr", "/bin", "/sbin", "/var"]
    for forbidden in forbidden_paths:
        if abs_path == forbidden or abs_path.startswith(forbidden + os.sep):
            raise ValueError(f"Cannot delete files in {forbidden}")
    if not os.path.exists(abs_path):
        return {"success": False, "error": "File not found"}
    os.remove(abs_path)
    return {"success": True, "path": path}
Ecosystem and Future Directions
Official and community MCP servers cover file‑system access, Slack messaging, GitHub API, browser automation (Puppeteer), PostgreSQL queries, and web search (Brave Search). Configuration files (e.g., claude_desktop_config.json) declare each server’s command, arguments, and environment variables, enabling plug‑and‑play deployment.
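For example, a minimal claude_desktop_config.json entry for a filesystem server might look like the following; the paths and environment values are placeholders:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "python",
      "args": ["/path/to/filesystem_server.py"],
      "env": {"LOG_LEVEL": "info"}
    }
  }
}
```

Each key under mcpServers names one server; the client launches the given command with its arguments and environment, then speaks MCP over the process's stdio.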
Protocol Evolution
Future work includes adding true bidirectional streaming (enhanced sampling / prompt caching) and richer multi‑agent collaboration, where several AI agents share a single MCP server, the server arbitrates resource conflicts, and messages can be broadcast between agents.
# Multi‑agent shared bridge (conceptual sketch)
class SharedMCPBridge:
    def __init__(self, server):
        self.server = server
        self.agents = {}

    def register_agent(self, agent_id: str, client_session):
        self.agents[agent_id] = client_session

    async def broadcast(self, message: dict, from_agent: str):
        """Broadcast a message to all agents except the sender"""
        for agent_id, session in self.agents.items():
            if agent_id != from_agent:
                await session.send_notification("agent/message", message)
Conclusion
MCP provides a standardized, ecosystem‑friendly way for AI models to call tools, access resources, and exchange data, dramatically reducing duplicated integration work. Production‑grade deployments benefit from health‑checking, retry logic, and strict security validation, while the open architecture invites community‑driven extensions and future protocol enhancements.