MCP Explained: The Revolutionary Protocol for Large‑Model Real‑World Interaction

This article analyzes the Model Context Protocol (MCP) introduced by Anthropic, detailing the standardized, distributed architecture with which it overcomes the limitations of Function Calling, and provides step‑by‑step Python examples for building an MCP client and server, demonstrating how large language models can reliably interact with external tools.


Background

Function Calling, introduced by OpenAI in 2023, enables large language models to invoke external tools via a mediator function, but each function requires extensive boilerplate: hundreds of lines of glue code, a JSON Schema description, and a custom prompt. The result is a "Tower of Babel" in which every project's tool implementations diverge.
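To make the boilerplate concrete, here is a sketch of what a single Function Calling tool costs in the OpenAI style: a hand-written schema plus hand-written dispatch code. The `get_weather` function and its registry are hypothetical, but every project repeats some version of this pattern for every tool.

```python
import json

# Hand-written schema (hypothetical get_weather example): every tool needs
# a block like this, maintained separately by every project.
get_weather_schema = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Matching dispatch boilerplate: the model returns a tool name and JSON
# arguments, and the application must route them to a local function by hand.
def dispatch(tool_name: str, arguments_json: str) -> str:
    registry = {"get_weather": lambda city: f"Sunny in {city}"}
    args = json.loads(arguments_json)
    return registry[tool_name](**args)

print(dispatch("get_weather", '{"city": "Beijing"}'))  # → Sunny in Beijing
```

MCP's goal, described below, is to replace this per-project glue with one shared convention.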

Anthropic released Model Context Protocol (MCP) in November 2024 as a protocol that standardizes the interface between AI agents (clients) and external services (servers), separating client and server responsibilities.

Technical Architecture and Core Advantages

Standardized Interface

MCP defines a single calling convention covering API definition, error handling, parameters, return values and documentation. Once a server implements the MCP specification, any client can invoke its functions without additional adapters. Example: an MCP‑based weather service exposing get_weather() can be called by any MCP‑compatible client.
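On the wire, MCP messages use JSON-RPC 2.0; the sketch below shows the rough shape of the `tools/list` exchange through which a client discovers a server's tools. The method name and envelope follow the MCP specification, while the tool fields here (a weather example) are illustrative.

```python
# Rough shape of MCP's JSON-RPC 2.0 tool-discovery exchange.
# The envelope and the "tools/list" method come from the MCP spec;
# the weather tool itself is an illustrative placeholder.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_weather",
                "description": "Return the current weather for a city",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

# Because every server answers this same request in this same shape,
# a client can enumerate tools without server-specific adapters.
for tool in response["result"]["tools"]:
    print(tool["name"], "-", tool["description"])
```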

Distributed and Asynchronous Design

Unlike Function Calling’s monolithic synchronous model, MCP uses a distributed architecture that decouples tool providers (servers) from tool consumers (clients). MCP natively supports asynchronous execution, allowing long‑running tasks to run in the background and improving throughput for concurrent requests.
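The throughput benefit of the asynchronous design can be seen with plain asyncio, no MCP code required. In this minimal sketch, two slow "tool calls" (stand-ins for long-running server requests) run concurrently instead of back-to-back:

```python
import asyncio
import time

# Two slow "tool calls" run concurrently instead of sequentially —
# the same pattern an asynchronous MCP client uses for long-running tasks.
async def slow_tool(name: str, seconds: float) -> str:
    await asyncio.sleep(seconds)  # stands in for a long-running server call
    return f"{name} done"

async def main() -> float:
    start = time.perf_counter()
    results = await asyncio.gather(
        slow_tool("query_database", 0.2),
        slow_tool("fetch_report", 0.2),
    )
    elapsed = time.perf_counter() - start
    print(results, f"in {elapsed:.2f}s")  # roughly 0.2s total, not 0.4s
    return elapsed

elapsed = asyncio.run(main())
```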

Ecosystem Support

Anthropic provides SDKs for Python, TypeScript and Java. With the MCP library installed, you can stand up a server in a few lines of code, and you can also call public MCP servers such as Baidu Maps.

Python MCP Development Walk‑through

Environment Preparation

Use uv to create a project and install the MCP SDK (without this last step, the `mcp` imports below will fail):

pip install uv
uv init mcp-demo
cd mcp-demo
uv add "mcp[cli]"

Activate the environment:

.venv\Scripts\activate   # Windows
source .venv/bin/activate   # Linux/macOS

Writing the MCP Server

Create server.py and define a simple addition tool:

from mcp.server.fastmcp import FastMCP

mcp = FastMCP('Demo')

@mcp.tool()
def add(a: int, b: int) -> int:
    """Calculate the sum of two integers and return it"""
    return a + b

if __name__ == "__main__":
    mcp.run(transport='stdio')

The @mcp.tool() decorator automatically generates the JSON Schema required for the model to recognize and invoke the function.
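A rough sketch of what the decorator does under the hood: derive a JSON Schema from the function's signature and type hints. (FastMCP's real implementation is more general and built on Pydantic; the helper below is a simplified stand-in that handles only `int` parameters.)

```python
import inspect

# Simplified stand-in for @mcp.tool()'s schema generation: read the
# function signature and map int annotations to JSON Schema "integer".
def schema_from_signature(fn) -> dict:
    props = {}
    for name, param in inspect.signature(fn).parameters.items():
        assert param.annotation is int  # simplified: ints only
        props[name] = {"type": "integer"}
    return {"type": "object", "properties": props, "required": list(props)}

def add(a: int, b: int) -> int:
    """Calculate the sum of two integers and return it"""
    return a + b

print(schema_from_signature(add))
```

The generated schema is what the client later forwards to the model as the tool's parameter specification.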

Writing the MCP Client

Create client.py. The client initializes an OpenAI‑compatible SDK, connects to the server via standard I/O, lists available tools, and runs an interactive chat loop:

import asyncio, json
from typing import Optional
from contextlib import AsyncExitStack
from openai import OpenAI
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

class MCPClient:
    def __init__(self):
        self.exit_stack = AsyncExitStack()
        self.openai_api_key = "YOUR_API_KEY"
        self.base_url = "https://api.deepseek.com"
        self.model = "deepseek-chat"
        self.client = OpenAI(api_key=self.openai_api_key, base_url=self.base_url)
        self.session: Optional[ClientSession] = None

    async def connect_to_server(self, server_script_path):
        server_params = StdioServerParameters(
            command="python",
            args=[server_script_path],
            env=None,
        )
        stdio_transport = await self.exit_stack.enter_async_context(stdio_client(server_params))
        self.stdio, self.write = stdio_transport
        self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))
        await self.session.initialize()
        response = await self.session.list_tools()
        tools = response.tools
        print("\nConnected to server, available tools:", [t.name for t in tools])

    async def process_query(self, query: str) -> str:
        messages = [{"role": "user", "content": query}]
        response = await self.session.list_tools()
        # OpenAI-compatible APIs expect the tool's schema under "parameters"
        available_tools = [{
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.inputSchema,
            }
        } for tool in response.tools]
        response = self.client.chat.completions.create(
            model=self.model,
            messages=messages,
            tools=available_tools,
        )
        content = response.choices[0]
        if content.finish_reason == "tool_calls":
            tool_call = content.message.tool_calls[0]
            tool_name = tool_call.function.name
            tool_args = json.loads(tool_call.function.arguments)
            result = await self.session.call_tool(tool_name, tool_args)
            messages.append(content.message.model_dump())
            messages.append({
                "role": "tool",
                "content": result.content[0].text,
                "tool_call_id": tool_call.id,
            })
            response = self.client.chat.completions.create(model=self.model, messages=messages)
            return response.choices[0].message.content
        return content.message.content

    async def chat_loop(self):
        print("\nMCP client started! Type 'quit' to exit")
        while True:
            query = input("\nUser: ").strip()
            if query.lower() == 'quit':
                break
            response = await self.process_query(query)
            print(f"\nModel: {response}")

    async def clean(self):
        await self.exit_stack.aclose()

async def main():
    import sys
    if len(sys.argv) < 2:
        print("Usage: python client.py server.py")
        sys.exit(1)
    client = MCPClient()
    try:
        await client.connect_to_server(sys.argv[1])
        await client.chat_loop()
    finally:
        # Close the transport even if the chat loop raises
        await client.clean()

if __name__ == "__main__":
    asyncio.run(main())

The client code does not bind to any specific server function, allowing reuse with any MCP server.

Testing the Setup

Activate the virtual environment and run:

.venv\Scripts\activate
python client.py server.py

The model calls the add function on the server and returns the computed sum, demonstrating reliable tool invocation.

Future Outlook

MCP aims to become a foundational protocol for AI agents, much as TCP/IP and HTTP are for the Internet, removing the "Tower of Babel" barrier and enabling scalable real‑world AI applications.

Tags: Python · MCP · Model Context Protocol
Written by

Fun with Large Models

Master's graduate from Beijing Institute of Technology, published four top‑journal papers, previously worked as a developer at ByteDance and Alibaba. Currently researching large models at a major state‑owned enterprise. Committed to sharing concise, practical AI large‑model development experience, believing that AI large models will become as essential as PCs in the future. Let's start experimenting now!
