Kickstart MCP in Minutes: Build a Python SSE Demo and Client

This guide walks you through installing the dependencies, building two MCP‑based SSE servers and a client in Python, and running the demo, then explains the core concepts and architecture of the Model Context Protocol (MCP) for AI agents.

JD Tech Talk
0.1 Environment Dependencies

Python >= 3.10 is required. Install the following packages:

Package               Version
-----------------    ---------
annotated-types      0.7.0
anyio                4.9.0
certifi              2025.4.26
click                8.2.1
h11                  0.16.0
httpcore             1.0.9
httpx                0.28.1
httpx-sse            0.4.0
idna                 3.10
mcp                  1.9.2
pydantic             2.11.5
pydantic-core        2.33.2
pydantic-settings    2.9.1
python-dotenv        1.1.0
python-multipart     0.0.20
sniffio              1.3.1
sse-starlette        2.3.6
starlette            0.47.0
typing-extensions    4.14.0
typing-inspection    0.4.1
uvicorn              0.34.3
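Most of the entries above are transitive dependencies. A minimal install pins only the top-level packages and lets pip resolve the rest (a sketch; adjust the pins to match the table if you need exact reproducibility):

```shell
# Install the packages the demo imports directly; pip pulls in the rest
# (pydantic, starlette, sse-starlette, anyio, ...) as dependencies.
python -m pip install "mcp==1.9.2" "httpx==0.28.1" "uvicorn==0.34.3"
```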

0.2 Implementing mcp‑server

Server 1 – Two tools

#!/usr/bin/env python
# -*- coding: UTF-8 -*-
from mcp.server import FastMCP

app = FastMCP('web-search', port=9000)

@app.tool()
async def web_search(query: str) -> str:
    """Search the internet for the given query and return a summary.

    Demo stub: it does not call a real search API, it just echoes the query.
    """
    return ':'.join(["query", query])

@app.tool()
async def hello_world() -> str:
    """Return a greeting string."""
    return "Hi mcp world!"

if __name__ == "__main__":
    app.run(transport='sse')

Server 2 – One tool

#!/usr/bin/env python
# -*- coding: UTF-8 -*-
from mcp.server import FastMCP

app = FastMCP('test-sse', port=9001)

@app.tool()
async def hi_world() -> str:
    """Return a greeting in Chinese."""
    return "你好 mcp world!"  # "Hello mcp world!"

if __name__ == "__main__":
    app.run(transport='sse')

0.3 Implementing mcp‑client

#!/usr/bin/env python
# -*- coding:UTF-8 -*-
import asyncio
from contextlib import AsyncExitStack
from mcp.client.sse import sse_client
from mcp import ClientSession

class MCPClient:
    def __init__(self, server_urls: list[str]):
        """Initialize MCP client with a list of SSE server URLs."""
        self.server_urls = server_urls
        self.sessions = {}
        self.tool_mapping = {}
        self.exit_stack = AsyncExitStack()

    async def initialize_sessions(self):
        for i, server_url in enumerate(self.server_urls):
            server_id = f"server{i}"
            streams_context = sse_client(url=server_url)
            streams = await self.exit_stack.enter_async_context(streams_context)
            session = await self.exit_stack.enter_async_context(ClientSession(*streams))
            await session.initialize()
            self.sessions[server_id] = session
            response = await session.list_tools()
            for tool in response.tools:
                prefixed_name = f"{server_id}_{tool.name}"
                self.tool_mapping[prefixed_name] = (session, tool.name)
            print(f"Connected to {server_url}, tools: {[t.name for t in response.tools]}")

    async def chat_loop(self):
        """Call every discovered tool, repeating a few passes for demo purposes."""
        for _ in range(5):
            try:
                for prefixed_name, (session, tool_name) in self.tool_mapping.items():
                    if tool_name == "web_search":
                        # '杭州今天天气' means "Hangzhou weather today".
                        res = await session.call_tool('web_search', {'query': '杭州今天天气'})
                    elif tool_name == "hello_world":
                        res = await session.call_tool('hello_world')
                    elif tool_name == "hi_world":
                        res = await session.call_tool('hi_world')
                    else:
                        res = "tool not found!"
                    print(f"{prefixed_name}:")
                    print(res)
            except Exception:
                import traceback
                traceback.print_exc()

    async def cleanup(self):
        """Close all sessions and release resources."""
        await self.exit_stack.aclose()
        print("All sessions cleaned up.")

async def main():
    server_urls = ["http://localhost:9000/sse", "http://localhost:9001/sse"]
    client = MCPClient(server_urls=server_urls)
    try:
        await client.initialize_sessions()
        await client.chat_loop()
    finally:
        await client.cleanup()

if __name__ == '__main__':
    asyncio.run(main())
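One detail worth calling out in the client above: tools from different servers are stored under a server-scoped prefix (`server0_web_search`, `server1_hi_world`, ...) so that two servers exposing identically named tools cannot collide. The mapping logic in isolation:

```python
def build_tool_mapping(server_tools: dict[str, list[str]]) -> dict[str, tuple[str, str]]:
    """Map 'serverN_toolname' -> (server_id, toolname), as the client does."""
    mapping = {}
    for server_id, tools in server_tools.items():
        for tool in tools:
            # Prefix with the server id so names stay unique across servers.
            mapping[f"{server_id}_{tool}"] = (server_id, tool)
    return mapping
```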

0.4 Run the demo

# Start server‑1 (terminal 1)
python sse_server_1.py
# Start server‑2 (terminal 2)
python sse_server_2.py
# Run the client (terminal 3)
python sse_cli.py

1 What is MCP?

Official definition

MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to large language models (LLMs). Think of it as a USB‑C interface for AI applications, enabling standardized connections between models and various data sources or tools.

Why choose MCP?

Continuously growing list of pre‑built integrations that LLMs can use directly.

Flexible switching between different LLM providers.

Best practices for securely handling data within your own infrastructure.

General architecture

MCP follows a client‑server model where a host application can connect to multiple servers.

(Figure: MCP architecture diagram)

Key components

MCP Hosts: Applications like Claude Desktop, IDEs, or AI tools that want to access data via MCP.

MCP Clients: Protocol clients that maintain a one‑to‑one connection with a server.

MCP Servers: Lightweight programs that expose specific capabilities through the Model Context Protocol.

Local data sources: Files, databases, or services that servers can safely access.

Remote services: External systems reachable over the internet via APIs.

MCP core concepts

Servers can provide three main types of capabilities:

Resources: Read‑only data such as API responses or file contents.

Tools: Functions that LLMs can invoke (with user approval).

Prompts: Pre‑written templates that guide users to complete specific tasks.

1.1 Interpreting MCP

MCP acts as a bridging layer that supports both local‑process communication and remote network communication, standardizing the interaction between large models and the resources they depend on.

2 Origin of MCP

The protocol was created to unify how LLMs access resources, tools, and prompts, fostering an open ecosystem where capabilities are shared under a gentleman’s agreement.

2.1 Supported resource types

MCP's three capability types (resources, tools, and prompts) correspond to the main ways applications feed context to large models, which is intended to simplify integration work in LLM‑driven development.

(Figure: MCP resource types)

2.2 Future outlook

The underlying assumption is that everyone will open‑source capabilities and honor the gentleman’s agreement, which may be optimistic and could introduce risks.

References

https://mcp-docs.cn/introduction


Tags: Python, AI agents, MCP, SSE, client-server
Written by JD Tech Talk, the official JD Tech public account delivering best practices and technology innovation.