Rapidly Build a Streamable HTTP MCP Server with the Official MCP SDK – Full End‑to‑End Guide

This article walks through the complete process of creating, testing, and publishing a streamable HTTP MCP server using the official MCP SDK, covering environment setup with Anaconda and uv, project structuring, code implementation, tool integration, Inspector testing, PyPI deployment, and client verification with CherryStudio.

Fun with Large Models

Environment Setup

Create an Anaconda virtual environment named mcp and activate it. Install the Rust‑based Python package manager uv, initialise a project, create a virtual environment inside the project, and install required libraries.

conda create -n mcp python=3.12
conda activate mcp
pip install uv
uv init mcp-streamable-weather
cd mcp-streamable-weather
uv venv
# Linux/macOS
source .venv/bin/activate
# Windows
.venv\Scripts\activate
uv add mcp requests

Project Layout

The repository is hosted at https://github.com/TangBaron/mcp-streamable-weather and follows a src layout. Create the package directory src/mcp_weather_http and three source files: __init__.py, __main__.py, and server.py.

mkdir -p ./src/mcp_weather_http
cd ./src/mcp_weather_http
# create __init__.py, __main__.py, server.py
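Beyond server.py, the two package files are small. A minimal sketch, assuming the click entry point main is defined in server.py (as it is later in this guide): __init__.py re-exports main so the mcp_weather_http:main console script declared in pyproject.toml can resolve it, and __main__.py makes the package runnable with python -m mcp_weather_http.

```python
# src/mcp_weather_http/__init__.py
# Re-export the click entry point so "mcp_weather_http:main" resolves.
from .server import main

__all__ = ["main"]
```

```python
# src/mcp_weather_http/__main__.py
# Allows running the package directly: python -m mcp_weather_http
from .server import main

if __name__ == "__main__":
    main()
```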

Server Implementation

server.py

server.py imports the required modules, defines an asynchronous function fetch_weather that calls the Seniverse weather API, and registers a get-weather tool with the MCP SDK. The command-line interface is built with click, and logging is configured via the standard logging module. A Server instance is created; the tool handler is registered with @app.call_tool() and the tool list is provided with @app.list_tools(). A StreamableHTTPSessionManager handles HTTP requests in stateless mode (no session history is kept) and streams responses as Server-Sent Events (SSE) unless --json-response is set. The ASGI application is built with Starlette, mounted at /mcp, and served with uvicorn.

import contextlib
import json
import logging

import click
import requests
import uvicorn

import mcp.types as types
from mcp.server.lowlevel import Server
from mcp.server.streamable_http_manager import StreamableHTTPSessionManager
from starlette.applications import Starlette
from starlette.routing import Mount

async def fetch_weather(city: str, api_key: str) -> str:
    """Call the Seniverse real-time weather API and return the result as JSON."""
    url = "https://api.seniverse.com/v3/weather/now.json"
    params = {"key": api_key, "location": city, "language": "zh-Hans", "unit": "c"}
    # requests is synchronous; the timeout prevents a stalled call from
    # blocking the event loop indefinitely. Errors propagate to the caller,
    # which logs them via the MCP session.
    response = requests.get(url, params=params, timeout=10)
    response.raise_for_status()
    now = response.json()["results"][0]["now"]
    return json.dumps(now)

@click.command()
@click.option("--port", default=3000, help="Port to listen on for HTTP")
@click.option("--api-key", required=True, help="Seniverse (心知天气) API key")
@click.option("--log-level", default="INFO", help="Log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)")
@click.option("--json-response", is_flag=True, default=False, help="Return plain JSON responses instead of SSE streaming")
def main(port, api_key, log_level, json_response):
    logging.basicConfig(level=getattr(logging, log_level.upper()),
                        format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
    logger = logging.getLogger("weather-server")
    app = Server("Weather-Streamable-HTTP-MCP-Server")

    @app.call_tool()
    async def call_tool(name, arguments):
        ctx = app.request_context
        city = arguments.get("city")
        if not city:
            raise ValueError("'city' is required in arguments")
        await ctx.session.send_log_message(level="info", data=f"Fetching weather for {city}…",
                                          logger="weather", related_request_id=ctx.request_id)
        try:
            weather = await fetch_weather(city, api_key)
        except Exception as err:
            await ctx.session.send_log_message(level="error", data=str(err),
                                              logger="weather", related_request_id=ctx.request_id)
            raise
        await ctx.session.send_log_message(level="info", data="Weather data fetched successfully!",
                                          logger="weather", related_request_id=ctx.request_id)
        return [types.TextContent(type="text", text=weather)]

    @app.list_tools()
    async def list_tools():
        return [types.Tool(name="get-weather",
                           description="Query real-time weather for a given city (Seniverse data)",
                           inputSchema={"type": "object", "required": ["city"],
                                        "properties": {"city": {"type": "string",
                                                                "description": "City name, e.g. '北京'"}}})]

    session_manager = StreamableHTTPSessionManager(app=app, event_store=None,
                                                   json_response=json_response, stateless=True)

    async def handle_streamable_http(scope, receive, send):
        await session_manager.handle_request(scope, receive, send)

    @contextlib.asynccontextmanager
    async def lifespan(app):
        async with session_manager.run():
            logger.info("Weather MCP server started! 🚀")
            try:
                yield
            finally:
                logger.info("Weather MCP server shutting down…")

    starlette_app = Starlette(debug=False,
                              routes=[Mount("/mcp", app=handle_streamable_http)],
                              lifespan=lifespan)
    uvicorn.run(starlette_app, host="0.0.0.0", port=port)
    return 0

if __name__ == "__main__":
    main()

Running and Testing the Server

Start the service with a valid Seniverse API key:

uv run ./src/mcp_weather_http/server.py --api-key YOUR_API_KEY

Open the MCP Inspector (provided by the official SDK), select HTTP Streamable mode, and point it to http://localhost:3000/mcp. The get-weather tool appears; invoking it with a city name returns the real‑time weather JSON, confirming correct operation.
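For debugging without the Inspector, it can help to see the raw JSON-RPC messages a Streamable HTTP client POSTs to http://localhost:3000/mcp (with Accept: application/json, text/event-stream headers). The sketch below shows the shape of an initialize request and a tools/call request for the get-weather tool; the protocolVersion string and clientInfo values are illustrative assumptions, not fixed by this server.

```python
import json

# First message of a session: the client introduces itself.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",   # assumption: check your SDK release
        "capabilities": {},
        "clientInfo": {"name": "demo-client", "version": "0.1.0"},
    },
}

# Tool invocation: matches the inputSchema declared by list_tools().
call_tool = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get-weather", "arguments": {"city": "北京"}},
}

# Each message is sent as the body of an HTTP POST to the /mcp endpoint.
body = json.dumps(call_tool, ensure_ascii=False)
print(body)
```

In stateless mode the server keeps no session history, so each POST is self-contained, which is what makes this kind of hand-rolled request useful for quick checks.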

Packaging and Publishing

After successful testing, build the distribution and upload it to PyPI so that it can be installed with pip:

uv pip install build twine   # install packaging tools
python -m build               # build the distribution
python -m twine upload dist/* # upload to PyPI

The pyproject.toml contains:

[project]
name = "mcp-streamable-weather"
version = "1.1.0"
description = "输入心知天气-API-KEY,获取实时天气信息。"
requires-python = ">=3.12"
dependencies = [
    "mcp>=1.9.0",
    "requests>=2.32.3",
]

[project.scripts]
mcp-streamable-weather = "mcp_weather_http:main"

[tool.setuptools]
package-dir = {"" = "src"}

[tool.setuptools.packages.find]
where = ["src"]

After upload, the package can be installed with pip install mcp-streamable-weather.

Verification with CherryStudio

Install the package locally and run the entry point:

uv run mcp-streamable-weather --api-key YOUR_API_KEY

In CherryStudio, add a new MCP server with the name mcp-streamable-weather, type “Streamable HTTP”, and URL http://localhost:3000/mcp. After saving, the client recognises the server and can invoke the get-weather tool, returning correct weather information for queries such as “今天北京天气”.

Python · MCP · PyPI · uv · ASGI · Streamable HTTP · CherryStudio
Written by

Fun with Large Models

Master's graduate from Beijing Institute of Technology, published four top‑journal papers, previously worked as a developer at ByteDance and Alibaba. Currently researching large models at a major state‑owned enterprise. Committed to sharing concise, practical AI large‑model development experience, believing that AI large models will become as essential as PCs in the future. Let's start experimenting now!
