
Deploy OpenClaw Autonomous Agents on Alibaba Cloud SAE – Full Step‑by‑Step Guide

This guide walks you through deploying the OpenClaw autonomous‑agent platform on Alibaba Cloud Serverless Application Engine (SAE), covering architecture, prerequisites, Docker‑in‑Docker setup, sandbox configuration, model integration, and DingTalk AI assistant creation.


OpenClaw is an open‑source autonomous‑agent framework that runs as a lightweight CLI‑driven gateway service, providing a secure, persistent, and extensible runtime for AI agents. Its architecture separates the Agent (decision core), Skills (capability boundary), and a gateway that coordinates interaction, memory, and execution.

Why Host OpenClaw on SAE

Alibaba Cloud Serverless Application Engine (SAE) offers a full‑featured container environment with serverless resource scheduling, making it ideal for OpenClaw’s Docker‑based sandbox execution, 24/7 task handling, and high‑availability requirements.

Prerequisites

Alibaba Cloud Serverless Application Engine (SAE) enabled and authorized for your account.

The saectl command‑line tool installed and configured (see the saectl documentation).

VPC with a public NAT gateway and an Elastic IP for sandbox container outbound access.

Step 1 – Deploy OpenClaw via SAE Application Center

Log in to the SAE console and open the Application Center.

Search for the template "OpenClaw — Serverless Deployment" and create a new service.

Fill in the required fields (e.g., service instance name, VPC, vSwitch) and keep other parameters at their defaults.

Click Create; provisioning takes 2–3 minutes and creates an application named openclaw‑gateway.

Step 2 – Initialize the OpenClaw Runtime

Access the gateway container either via the SAE WebShell or the saectl CLI:

saectl exec -it -n <namespace> <pod‑name>

Inside the container:

stty rows 40 cols 120
openclaw onboard --install-daemon

This interactive command sets up the local daemon and persists configuration.
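To confirm the daemon initialized and the configuration was persisted, you can inspect the stored settings. This is a sketch only: the `config get` subcommand is assumed here as the counterpart to the `config set` calls used later, so verify it against `openclaw config --help` on your build.

```shell
# Inspect persisted gateway settings (assumed subcommand; check
# `openclaw config --help` to confirm it exists on your install).
openclaw config get gateway

# The daemon should also be visible to supervisord inside the container.
supervisorctl status openclaw
```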

Step 3 – Configure Model Provider (DashScope)

openclaw config set models.providers.dashscope '{
  "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
  "api": "openai-completions",
  "apiKey": "your-api-key-here",
  "models": [{
    "id": "qwen3-max-2026-01-23",
    "name": "qwen3-max-2026-01-23",
    "reasoning": false,
    "input": ["text"],
    "cost": {"input":0,"output":0,"cacheRead":0,"cacheWrite":0},
    "contextWindow": 262144,
    "maxTokens": 65536
  }]
}'

Replace your-api-key-here with a valid DashScope API key, then set it as the default model:

openclaw config set agents.defaults.model.primary "dashscope/qwen3-max-2026-01-23"
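Before relying on the key through OpenClaw, it can be worth testing it directly against DashScope's OpenAI‑compatible endpoint. A minimal curl sketch, using the same base URL, model ID, and key placeholder as the provider config above (compatible mode exposes the standard OpenAI chat‑completions path):

```shell
# Smoke-test the DashScope key against the OpenAI-compatible endpoint.
# Replace your-api-key-here with the key used in the provider config.
curl -s https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions \
  -H "Authorization: Bearer your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen3-max-2026-01-23",
        "messages": [{"role": "user", "content": "ping"}]
      }'
```

A JSON completion in the response (rather than an authentication error) confirms the key and model ID before the agent starts using them.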

Step 4 – Enable and Configure the Sandbox

# Enable full sandbox mode
openclaw config set agents.defaults.sandbox.mode "all"
# Base Docker image for code execution
openclaw config set agents.defaults.sandbox.docker.image "openclaw-sandbox:bookworm-slim"
# Network mode (bridge allows external access)
openclaw config set agents.defaults.sandbox.docker.network "bridge"
# Enable browser automation sandbox
openclaw config set agents.defaults.sandbox.browser.enabled true
# Browser sandbox image
openclaw config set agents.defaults.sandbox.browser.image "openclaw-sandbox-browser:bookworm-slim"
# Browser network mode
openclaw config set agents.defaults.sandbox.browser.network "bridge"
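The sandbox settings above can be sanity‑checked from inside the gateway container before an agent uses them. A quick sketch, assuming the Docker‑in‑Docker daemon is running and the sandbox image from the template is present locally:

```shell
# Confirm the inner Docker daemon is reachable.
docker info >/dev/null && echo "docker daemon OK"

# Confirm the sandbox image starts under bridge networking; outbound
# access then flows through the VPC's NAT gateway and EIP.
docker run --rm --network bridge openclaw-sandbox:bookworm-slim \
  sh -c 'echo "sandbox OK"'
```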

Step 5 – Start the Gateway Service

supervisorctl start openclaw

After any configuration changes, restart with supervisorctl restart openclaw.
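Since the gateway runs under supervisord, supervisorctl can also confirm it came up cleanly and surface its logs if it did not:

```shell
# Process state should report RUNNING with an uptime.
supervisorctl status openclaw

# Stream the gateway log if the state is FATAL or BACKOFF.
supervisorctl tail -f openclaw
```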

Accessing OpenClaw UI

OpenClaw offers both a terminal UI (TUI) and a Web Control UI. To launch the TUI inside the container, run openclaw tui. For the Web UI, ensure the gateway bind address is set to "lan" (allowing external access) and note the port (default 18789). Update the bind setting if needed:

openclaw config set gateway.bind "lan"
supervisorctl restart openclaw
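After restarting, you can check that the Web UI answers on the expected port from inside the container (reaching it externally additionally depends on the NAT/EIP setup from the prerequisites and the load balancer in front of the instance):

```shell
# Expect an HTTP status line (e.g. 200 or a redirect) from the Web UI.
curl -sI http://127.0.0.1:18789 | head -n 1
```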

Integrating with DingTalk – Build an AI Assistant

1. Create a DingTalk Application

Log in to the DingTalk Open Platform with developer permissions.

Create a new application, record the Client ID and Client Secret .

2. Create an AI Message Card Template

Navigate to the Card Platform, create a new AI message card template, and save the template ID.

3. Set Up an AppFlow Connection

In AppFlow, create a new connection flow using the DingTalk application credentials and the OpenClaw gateway webhook URL (e.g., https://<CLB_PUBLIC_IP>:18789).

Enter the saved card template ID and complete the flow configuration.

4. Configure the DingTalk Robot

Enable the robot in the DingTalk app, set the message receiving mode to HTTP, and paste the OpenClaw webhook URL.

Grant the robot the Card.Streaming.Write and Card.Instance.Write permissions.

5. Deploy the Robot

Publish a new version of the DingTalk application.

Add the robot to a DingTalk group; users can now @ the robot to start conversations with the OpenClaw agent.

For detailed reference, see the official OpenClaw documentation and Alibaba Cloud SAE guides.

[Figure: OpenClaw architecture diagram]
[Figure: SAE deployment UI]
Written by

Alibaba Cloud Native

We publish cloud-native tech news, curate in-depth content, host regular events and live streams, and share Alibaba product and user case studies. Join us to explore and share the cloud-native insights you need.
