Unlocking AI Integration: A Hands‑On Guide to the Model Context Protocol (MCP)
This article introduces the Model Context Protocol (MCP), an open standard from Anthropic that provides a secure, standardized, bidirectional bridge between large language models and external tools. It then walks through MCP's architecture and core components, Python server and client code, OpenAI integration, the typical usage flow, the ecosystem, and the future outlook.
The Model Context Protocol (MCP) is an open protocol released by Anthropic that aims to provide a safe, standardized, bidirectional connection channel between large language models (LLMs) and external data sources or tools.
What Is MCP?
Analogous to a USB‑C interface for AI, MCP defines a universal standard so that any AI application supporting MCP (e.g., Cursor IDE, Claude Desktop) can plug into various external tools—databases, APIs, local files—without writing custom adapters.
Architecture and Core Components
MCP follows a classic client‑server architecture. The main components and their responsibilities are:
MCP Host: The AI application the user interacts with, such as Cursor IDE or Claude Desktop. It acts as the "brain" that initiates all requests.
MCP Client: Embedded inside the Host, it serves as the protocol translator. Each Client maintains a dedicated one-to-one connection with a Server and handles message routing and conversion.
MCP Server: A lightweight service exposing three core capabilities:
Tool : Executable functions (e.g., send email, query a database).
Resource : Read‑only data sources such as file contents, database records, or API responses.
Prompt : Pre‑defined dialogue templates that guide the LLM to perform specific workflows.
MCP Protocol: Uses the JSON-RPC 2.0 message format and supports both Stdio (local) and HTTP (remote) transports.
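To make the wire format concrete, a tool invocation travels as a JSON-RPC 2.0 request/response pair roughly like the sketch below. The "tools/call" method name follows the MCP specification; the "greet" tool and its arguments are a hypothetical payload for illustration, and the exact result shape can vary by server.

```python
import json

# Sketch of the JSON-RPC 2.0 messages MCP exchanges over Stdio or HTTP.
# "tools/call" is the MCP method for invoking a tool; the payload targets
# a hypothetical "greet" tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "greet", "arguments": {"name": "Ford"}},
}

# A typical successful response: the matching id, plus a result whose
# content is a list of typed blocks (a text block, here).
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Hello, Ford!"}]},
}

print(json.dumps(request, indent=2))
```

Because every message is plain JSON-RPC, the same request works unchanged whether the transport is a local Stdio pipe or a remote HTTP connection.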
Building an MCP Server (Python)
The following minimal example creates an MCP server that offers a simple greet tool returning a greeting string.
from fastmcp import FastMCP

mcp = FastMCP("My MCP Server")

@mcp.tool
def greet(name: str) -> str:
    return f"Hello, {name}!"

if __name__ == "__main__":
    mcp.run(transport="http", port=8000)

Calling the Server from a Client
A client can list available tools and invoke the greet tool as shown below.
import asyncio
from fastmcp import Client

client = Client("http://localhost:8000/mcp")

async def list_tools():
    async with client:
        tools = await client.list_tools()
        print(tools)

async def call_tool(name: str):
    async with client:
        result = await client.call_tool("greet", {"name": name})
        print(result)

asyncio.run(call_tool("Ford"))

OpenAI MCP Invocation
Using the OpenAI Python SDK, an LLM can call the same MCP service through a tool definition. Note that OpenAI's hosted MCP tool connects to the server from OpenAI's side, so the server_url must be publicly reachable; a localhost address works only as a placeholder for illustration.
from openai import OpenAI

client = OpenAI()
resp = client.responses.create(
    model="gpt-5",
    tools=[
        {
            "type": "mcp",
            "server_label": "dmcp",
            "server_description": "Demo MCP Server",
            "server_url": "http://localhost:8000/mcp",
            "require_approval": "never",
        },
    ],
    input="Hello, MCP!",
)
print(resp.output_text)

Typical Interaction Flow
1. List the tools and resources available in the MCP service.
2. Select the tool or resource needed for the task.
3. Construct a request and send it to the MCP service.
4. Process the result returned by the service.
5. Incorporate the result back into the LLM conversation.
Ecosystem
Popular MCP Hosts : Cursor IDE, Claude Desktop, OpenWebUI – add an MCP Server in their settings to extend functionality.
MCP Server Repositories : Official examples at github.com/modelcontextprotocol/servers and community directory cursor.directory offering tools ranging from file system access to cloud services.
Development SDKs : Anthropic provides official Python and TypeScript SDKs that simplify server and client development. Documentation: https://github.com/modelcontextprotocol
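For reference, Claude Desktop and Cursor both register servers through a JSON configuration of roughly the following shape. The exact file location and supported keys vary by host, and the "demo" entry, command, and script name here are placeholders.

```json
{
  "mcpServers": {
    "demo": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```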
Trivia
Inspiration : MCP draws heavily from the Language Server Protocol (LSP), which standardized IDE support for multiple programming languages.
Permission Model: Tools are invoked by the model, while access to Resources is fully under the user's control; credentials stay on the MCP Server side, so API keys are never exposed to the LLM provider.
Commercial Outlook : The protocol shows strong potential in consumer‑facing (To C) scenarios such as smart hardware and social apps, but may face adoption challenges in enterprise (To B) contexts where software prefers to be the entry point rather than a callable tool.
References
Anthropic MCP official documentation – https://www.anthropic.com/news/model-context-protocol
Model Context Protocol (MCP) quick‑start guide – https://github.com/liaokongVFX/MCP-Chinese-Getting-Started-Guide
FastMCP official documentation – https://gofastmcp.com/getting-started/quickstart
OpenAI MCP guide – https://platform.openai.com/docs/guides/tools-connectors-mcp