Quickly Master MCP: Build a Python SSE Server and Client in Minutes

This guide introduces the Model Context Protocol (MCP), explains its purpose as a standardized USB‑C‑like interface for LLMs, and provides step‑by‑step Python code to set up SSE‑based MCP servers and a client, plus essential installation details and execution commands.

JD Cloud Developers

Quick MCP Overview

A few minutes is enough to understand MCP, a lightweight open protocol that standardizes how applications provide context to large language models (LLMs), much like a USB‑C interface for AI.

0.1 Environment Dependencies

Package               Version
-----------------     ---------
annotated-types      0.7.0
anyio                4.9.0
certifi              2025.4.26
click                8.2.1
h11                  0.16.0
httpcore             1.0.9
httpx                0.28.1
httpx-sse            0.4.0
idna                 3.10
mcp                  1.9.2
pydantic             2.11.5
pydantic-core        2.33.2
pydantic-settings    2.9.1
python-dotenv        1.1.0
python-multipart     0.0.20
sniffio              1.3.1
sse-starlette        2.3.6
starlette            0.47.0
typing-extensions    4.14.0
typing-inspection    0.4.1
uvicorn              0.34.3
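The table above can be reproduced with a pinned requirements file. Since mcp==1.9.2 pulls in most of the other packages (httpx, httpx-sse, sse-starlette, uvicorn, pydantic) as transitive dependencies, pinning it alone is usually enough:

```text
mcp==1.9.2
```

Run `pip install -r requirements.txt` inside a fresh virtual environment to avoid version conflicts.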

0.2 Implementing MCP Server

Server 1 – Two Tools

#!/usr/bin/env python
# -*- coding:UTF-8 -*-
#########################################################################
# File Name: sse_server_1.py
# Author: AI_Team
# Mail: [email protected]
# Created Time: 17:19:01 2025-06-04
#########################################################################
from mcp.server import FastMCP

# Create an MCP server named "web-search", listening on port 9000.
app = FastMCP('web-search', port=9000)

@app.tool()
async def web_search(query: str) -> str:
    """Search internet content.
    Args:
        query: content to search
    Returns:
        Summary of search results
    """
    res_data = ["query", query]
    return ':'.join(res_data)

@app.tool()
async def hello_world() -> str:
    """Greet the world.
    Returns:
        Greeting string
    """
    return "Hi mcp world!"

if __name__ == "__main__":
    app.run(transport='sse')
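FastMCP's `@app.tool()` decorator reads the function name, docstring, and type hints to build the tool list a client later retrieves with `list_tools()`. The following is a minimal plain-Python sketch of that registration pattern, not the real FastMCP internals:

```python
import inspect
from typing import Callable, get_type_hints

TOOLS: dict[str, dict] = {}  # name -> metadata, mimicking a server-side registry

def tool():
    """Decorator that records a function as a callable tool."""
    def register(fn: Callable) -> Callable:
        hints = get_type_hints(fn)
        TOOLS[fn.__name__] = {
            "description": inspect.getdoc(fn) or "",
            # Parameter names mapped to their annotated type names.
            "parameters": {k: v.__name__ for k, v in hints.items() if k != "return"},
        }
        return fn
    return register

@tool()
async def web_search(query: str) -> str:
    """Search internet content."""
    return f"query:{query}"

print(sorted(TOOLS))                       # ['web_search']
print(TOOLS["web_search"]["parameters"])   # {'query': 'str'}
```

The real server additionally converts the hints into a JSON Schema so an LLM can see each tool's expected arguments.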

Server 2 – One Tool

#!/usr/bin/env python
# -*- coding:UTF-8 -*-
#########################################################################
# File Name: sse_server_2.py
# Author: AI_Team
# Mail: [email protected]
# Created Time: 17:19:01 2025-06-04
#########################################################################
from mcp.server import FastMCP

# A second, independent MCP server on a different port.
app = FastMCP('test-sse', port=9001)

@app.tool()
async def hi_world() -> str:
    """Another way to greet the world, returns Chinese text."""
    return "你好 mcp world!"

if __name__ == "__main__":
    app.run(transport='sse')

0.3 Implementing MCP Client

#!/usr/bin/env python
# -*- coding:UTF-8 -*-
#########################################################################
# File Name: sse_cli.py
# Author: AI_Team
# Mail: [email protected]
# Created Time: 18:43:37 2025-06-04
#########################################################################
import asyncio
from contextlib import AsyncExitStack
from mcp.client.sse import sse_client
from mcp import ClientSession

class MCPClient:
    def __init__(self, server_urls: list[str]):
        """Initialize MCP client with a list of SSE server URLs."""
        self.server_urls = server_urls
        self.sessions = {}
        self.tool_mapping = {}
        self.exit_stack = AsyncExitStack()

    async def initialize_sessions(self):
        """Connect to all SSE servers and retrieve available tools."""
        for i, server_url in enumerate(self.server_urls):
            server_id = f"server{i}"
            streams_context = sse_client(url=server_url)
            streams = await self.exit_stack.enter_async_context(streams_context)
            session = await self.exit_stack.enter_async_context(ClientSession(*streams))
            await session.initialize()
            self.sessions[server_id] = session
            response = await session.list_tools()
            for tool in response.tools:
                prefixed_name = f"{server_id}_{tool.name}"
                self.tool_mapping[prefixed_name] = (session, tool.name)
            print(f"Connected to {server_url}, tools: {[t.name for t in response.tools]}")

    async def chat_loop(self):
        """Demo loop: invoke every discovered tool a few times."""
        for _ in range(5):
            try:
                for prefixed_name, (session, tool_name) in self.tool_mapping.items():
                    if tool_name == "web_search":
                        res = await session.call_tool('web_search', {'query': 'weather in Hangzhou today'})
                    elif tool_name == "hello_world":
                        res = await session.call_tool('hello_world')
                    elif tool_name == "hi_world":
                        res = await session.call_tool('hi_world')
                    else:
                        res = "tool does not exist!"
                    print(tool_name + ": ")
                    print(res)
            except Exception:
                import traceback
                traceback.print_exc()

    async def cleanup(self):
        """Close all sessions and release resources."""
        await self.exit_stack.aclose()
        print("All sessions cleaned up.")

async def main():
    server_urls = ["http://localhost:9000/sse", "http://localhost:9001/sse"]
    client = MCPClient(server_urls=server_urls)
    try:
        await client.initialize_sessions()
        await client.chat_loop()
    finally:
        await client.cleanup()

if __name__ == '__main__':
    asyncio.run(main())
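The prefixing scheme in initialize_sessions exists so that tools with the same name on different servers do not collide. Here is a stdlib-only sketch of the same idea, with plain dicts standing in for live sessions:

```python
# Each server advertises its own tool names; "hello" appears on both.
servers = {
    "server0": ["web_search", "hello"],
    "server1": ["hi_world", "hello"],
}

tool_mapping: dict[str, tuple[str, str]] = {}
for server_id, tools in servers.items():
    for name in tools:
        # Prefix with the server id so both "hello" tools stay addressable.
        tool_mapping[f"{server_id}_{name}"] = (server_id, name)

# A call routed through the prefixed name reaches the right server:
assert tool_mapping["server1_hello"] == ("server1", "hello")
print(sorted(tool_mapping))
```

In the real client the tuple holds `(session, tool_name)`, so dispatching a prefixed name is a single dictionary lookup followed by `session.call_tool(tool_name, args)`.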

0.4 Run the Demo

# Start server‑1
python sse_server_1.py
# Start server‑2
python sse_server_2.py
# Run the client
python sse_cli.py

What Is MCP?

MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs, acting like a USB‑C interface for AI applications, enabling seamless connection of data sources and tools.

Why Choose MCP?

Continuously growing list of pre‑built integrations that LLMs can use directly.

Flexible switching between different LLM providers.

Best practices for securely handling data within your own infrastructure.

General Architecture

The core of MCP follows a client‑server model where host applications can connect to multiple servers.

MCP architecture diagram

Key Components

MCP Hosts : Applications like Claude Desktop, IDEs, or AI tools that want to access data via MCP.

MCP Clients : Protocol clients that maintain a one‑to‑one connection with a server.

MCP Servers : Lightweight programs exposing capabilities through the Model Context Protocol.

Local Data Sources : Files, databases, or services that servers can safely access.

Remote Services : External systems reachable via APIs.

MCP Core Concepts

Servers can provide three main types of capabilities:

Resources : Read‑only data such as API responses or file contents.

Tools : Functions that LLMs can invoke (with user approval).

Prompts : Pre‑written templates that guide users to complete specific tasks.
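On the wire, each capability maps to a small family of JSON-RPC methods. The method names below follow the MCP specification, though any given server may implement only some of them:

```python
# MCP capability -> JSON-RPC methods a client may issue (per the MCP spec).
CAPABILITY_METHODS = {
    "resources": ["resources/list", "resources/read"],
    "tools": ["tools/list", "tools/call"],
    "prompts": ["prompts/list", "prompts/get"],
}

# Every method is namespaced under its capability.
for cap, methods in CAPABILITY_METHODS.items():
    assert all(m.startswith(cap + "/") for m in methods)

print(sorted(CAPABILITY_METHODS))  # ['prompts', 'resources', 'tools']
```

This is why the client in section 0.3 calls `session.list_tools()` before dispatching: discovery (`*/list`) always precedes invocation.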

Interpretation

MCP acts as a bridge layer supporting both local process communication and remote network communication, normalizing the relationship between large models and the resources they depend on.

MCP bridge illustration

Future Outlook

The underlying logic of MCP assumes open-source capability sharing and good-faith cooperation among participants; if that assumption breaks down, the ecosystem may face risks.

References

https://mcp-docs.cn/introduction


Tags: Python, MCP, LLM integration, SSE, client-server, AI protocol
Written by

JD Cloud Developers

JD Cloud Developers (Developer of JD Technology) is a JD Technology Group platform offering technical sharing and communication for AI, cloud computing, IoT and related developers. It publishes JD product technical information, industry content, and tech event news. Embrace technology and partner with developers to envision the future.
