
Deploying MCP Server on Serverless Cloud Functions with Cube Secure Containers

This article explains how to deploy a Model Context Protocol (MCP) server, illustrated with a Python weather-query example, on Tencent Cloud Function (SCF) using either a Docker image or direct code upload. It leverages Cube's high-security lightweight containers for fast start-up and highlights the serverless advantages over Kubernetes for AI agents and tool integration: automatic scaling, cost efficiency, and simplified operations.

Tencent Technical Engineering

The Model Context Protocol (MCP) standardizes the interface between large language models (LLMs) and external tools, enabling seamless integration of AI agents. Combining MCP with a Serverless architecture provides elastic compute resources that can dynamically scale to meet the demands of AI agents, such as real‑time scheduling of thousands of agents within an enterprise.

1. MCP Overview

MCP, promoted by Anthropic, acts like a USB‑C port for AI applications, allowing models to connect to various data sources and tools through a unified protocol. OpenAI’s recent Agent SDK update officially supports MCP, facilitating rapid integration of multiple tools.

2. Why Use MCP

Plugin‑style integration of datasets or tools.

Decouples tools from LLMs, enabling multi‑vendor switching.

Preserves data privacy by keeping data on‑premises.

3. MCP Architecture

The core is a client‑server model with hosts (e.g., Claude Desktop, IDE), MCP clients, MCP servers, local data sources, and remote services.
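Under the hood, MCP messages between client and server are JSON-RPC 2.0. As a rough illustration of the framing a client sends when invoking a tool (the method and field names follow the MCP specification; the tool name and arguments are just examples):

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 request in the shape MCP uses to invoke a tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

request = make_tool_call(1, "get_weather", {"city": "Shenzhen"})
print(json.dumps(request, indent=2))
```

The host's MCP client handles this framing for you; the sketch only shows why any tool behind any server looks the same to the model.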

4. Example: Weather Query MCP Server

The following Python code implements a FastMCP server that queries a weather API. The server listens on port 9000, the port SCF requires for both image-based and code-based web function deployments.

from mcp.server.fastmcp import FastMCP
import logging
import httpx

# Initialize the FastMCP server; SCF web functions must listen on port 9000
mcp = FastMCP("weather", host="0.0.0.0", port=9000)

# Constants
API_KEY = "api key"  # replace with your weather API key

@mcp.tool()
def get_weather(city: str) -> str:
    """Fetch the current weather for a city.

    Args:
        city: City name
    """
    try:
        with httpx.Client(verify=True) as client:
            params = {"key": API_KEY, "city": city, "output": "json"}
            response = client.get(
                "https://apis.map.qq.com/ws/weather/v1/", params=params, timeout=10
            )
        logging.info(f"Status Code: {response.status_code}")
        logging.info(f"Response: {response.text}")
        weather_data = response.json()
        if weather_data.get("status") != 0:
            return f"Failed to fetch weather: {weather_data.get('message', 'unknown error')}"
        infos = weather_data.get("result", {}).get("infos", {})
        if not infos:
            return "Unable to fetch weather: response data is empty"
        return (
            f"Weather: {infos.get('weather', '')}\n"
            f"Temperature: {infos.get('temperature', '')}°C\n"
            f"Humidity: {infos.get('humidity', '')}%\n"
            f"Wind force: {infos.get('wind_power', '')}"
        )
    except httpx.HTTPError as e:
        error_msg = f"HTTP request failed: {e}"
        logging.error(error_msg)
        return error_msg
    except Exception as e:
        error_msg = f"Failed to fetch weather: {e}"
        logging.error(error_msg)
        return error_msg

if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    mcp.run(transport='sse')
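Before deploying, the response-parsing logic can be exercised locally without the MCP runtime or a real API call. A rough check, assuming the payload shape the handler above expects:

```python
def format_weather(weather_data: dict) -> str:
    """Format a weather payload the same way the tool above does (abridged)."""
    if weather_data.get("status") != 0:
        return f"Failed to fetch weather: {weather_data.get('message', 'unknown error')}"
    infos = weather_data.get("result", {}).get("infos", {})
    if not infos:
        return "Unable to fetch weather: response data is empty"
    return f"Weather: {infos.get('weather', '')}, Temperature: {infos.get('temperature', '')}°C"

# Simulated success and failure payloads (the shapes are assumptions)
ok = {"status": 0, "result": {"infos": {"weather": "sunny", "temperature": 28}}}
bad = {"status": 110, "message": "invalid key"}
print(format_weather(ok))
print(format_weather(bad))
```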

Deploy the server to Tencent Cloud Function (SCF) either via a Docker image or direct code upload.

5. Deploying via Docker Image

# Use official Python 3.13 slim image
FROM python:3.13.2-slim
WORKDIR /app
COPY . /app
RUN pip install --no-cache-dir .
EXPOSE 9000
CMD ["python", "weather.py"]
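The `RUN pip install --no-cache-dir .` step installs the project as a package, which requires a build definition alongside `weather.py`. A minimal `pyproject.toml` might look like the following (the project name, version, and unpinned dependencies are assumptions):

```toml
[project]
name = "weather-mcp"
version = "0.1.0"
dependencies = [
    "mcp",
    "httpx",
]
```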

Push the built image to Tencent Container Registry (TCR), then create an SCF function from the container image: select the Web Function type, set a longer timeout (e.g., 120 s), and enable high concurrency.

6. Deploying via Code

Upload the same Python script directly; deployment is faster because no image build is required. The SCF CLI can accelerate deployment further.
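For direct upload, the dependencies still need to be resolvable in the function environment, for example via a `requirements.txt` shipped (or pre-vendored) with the script; leaving the packages unpinned here is an assumption:

```
mcp
httpx
```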

7. Benefits of Cloud Functions vs. Kubernetes

Serverless functions offer zero‑infrastructure management, automatic scaling, pay‑per‑use billing, rapid start‑up, event‑driven integration, and simplified CI/CD, making them ideal for lightweight, bursty workloads. Kubernetes provides more control for large‑scale, long‑running services but incurs higher operational overhead.

8. Cube Secure Containers

Cube provides high-security, lightweight containers that combine the isolation of virtual machines with the start-up speed of containers. They are well suited for AI agents and MCP servers that need fast cold starts (under 100 ms) and high concurrency with fine-grained sizing (e.g., instances as small as 0.1 vCPU with 64 MB of memory).

9. AI on Serverless

Deploying MCP and AI agents on Serverless yields elastic scaling, cost efficiency, and simplified operations. Scenarios include dynamic AI agent generation, tool integration, database access, log analysis, and web‑automation via Puppeteer.

10. Application Scenarios

Database‑driven MCP servers for data analysis.

Cloud‑API‑driven resource management.

CLS‑based log analysis.

Cloud monitoring and system health checks.

Job scheduling for AI agents via SCF.

Puppeteer‑based web crawling and automation.

Written by Tencent Technical Engineering, the official account of Tencent Technology: a platform for publishing and analyzing Tencent's technological innovations and cutting-edge developments.
