How Graph-as-a-Service Revolutionizes Alibaba’s Ad Engine with Serverless Architecture

This article explains how Alimama's advertising engine adopted a Serverless, Graph‑as‑a‑Service (GaaS) model to unify budget flow across multiple ad venues, align development‑to‑delivery granularity with graph structures, and dramatically improve testing, experimentation, and production‑release efficiency.

Alimama Tech

Background and the "Budget/Land" Theme

Before 2021, Alimama's ad business ran separate search, display, and live‑streaming engines. The launch of the "Wanxiang Platform" allowed a single budget to flow across all ad venues, and the later "Wanxiang Platform Boundary‑less" version let that budget move freely among keyword, audience, product, and short‑video scenarios, giving advertisers more flexibility and engineers new optimization levers.

Serverless Engine Architecture and Graph‑as‑a‑Service (GaaS)

To support the budget‑through concept, the team introduced a Serverless advertising engine architecture that enables "develop once, integrate as needed". The engine's capability model is expressed as a directed acyclic graph (DAG), giving rise to the "Graph as a Service" (GaaS) paradigm, which doubled development self‑test efficiency and improved change‑release efficiency by 70%.

Core GaaS Components

Capability‑centric reconstruction: a full‑graph engine that reshapes engine logic at the granularity of business capabilities.

Capability‑oriented development: compile‑deploy units built from operators, orchestration, and data, aligning the business‑level "capability" with the engineering‑level "graph".

Capability‑oriented self‑test: lightweight local testing replaces heavyweight integration tests.

Capability‑oriented experimentation: graph‑level experiments avoid linear cost growth when reusing graphs.

Capability‑oriented release: versioned capability deployment aligns production changes across modules.
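The develop‑once unit the list above describes bundles operators, orchestration, and data under a version. A minimal Python sketch of that bundle (the class and field names are hypothetical, not the actual EADS API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Capability:
    """One develop-once, integrate-as-needed unit (illustrative only)."""
    name: str
    operators: tuple = ()     # C++ UDF plugin labels, e.g. ":bidding_udf"
    orchestration: str = ""   # Python TableAPI DAG entry point
    data: tuple = ()          # algorithmic data dependencies
    version: str = "0.1.0"    # versioned release unit for aligned rollouts

# assembling the smart-bidding capability from its three parts
bidding = Capability(
    name="bidding",
    operators=(":bidding_udf",),
    orchestration="bidding_graph.py",
    data=(":my_promotion_bidding_data",),
)
```

Versioning the whole bundle, rather than its parts individually, is what lets a release align operator, orchestration, and data changes across modules.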

GaaS core capability architecture

Three Unifications Enabling Cross‑Engine Logic Recognition

Unified runtime: switch from control‑flow to data‑flow, decoupling business orchestration from low‑level execution and achieving runtime mutual recognition.

Unified data abstraction: model heterogeneous data as two‑dimensional tables (Table) managed by the framework, allowing transparent consumption and reducing heavy data dependencies.

Unified business abstraction: standardize operator schemas, eliminate large session buses, and produce stateless, atomized logic fragments that can move freely across modules.

Graph Modeling and Deployment

The engine is built on the EADS graph framework, where a Graph bundles Python TableAPI DAG definitions, C++ UDF plugins, and schema definitions. Graphs are invoked through TableAPI calls and compiled via Bazel BUILD rules, which support inline, local, or remote dependencies, analogous to static and dynamic linking in C++.

load("//graph:eads_graph.bzl", "eads_graph")  # .bzl path illustrative; use the rule's actual location

eads_graph(
    name = "bidding_graph",
    graph_main = "bidding_graph.py",
    py_deps = [],
    eads_udfs = [":bidding_udf"],
    eads_datas = [":my_promotion_bidding_data"],
    graph_deps = ["//graph:fetch_param_graph"],
)
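The `bidding_graph.py` entry point named in the BUILD rule would hold the TableAPI DAG definition. The real EADS TableAPI is not public, so the following is only a shape sketch with a toy stand‑in context; every name here is hypothetical:

```python
class Ctx:
    """Toy stand-in for a graph-build context (not the real EADS API)."""
    def __init__(self, inputs):
        self.inputs, self.outputs = inputs, {}
    def input_table(self, name):
        return self.inputs[name]
    def output_table(self, name, table):
        self.outputs[name] = table

def bidding_graph(ctx):
    """Hypothetical TableAPI-style DAG: read a table, transform, emit."""
    ad = ctx.input_table("ad")                               # framework-managed Table
    scored = [{**row, "bid": row["bid"] * 2} for row in ad]  # stands in for the C++ bidding UDF
    ctx.output_table("ad", scored)

ctx = Ctx({"ad": [{"ad_id": 1, "bid": 1.0}]})
bidding_graph(ctx)
```

The point of the sketch is the shape: the graph function only declares table flow, while the framework owns table storage and execution.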

Local Testing Workflow

Test graph construction: mock input tables and rewrite the original graph into a test graph.

Local runtime: build a small index, compile the sub‑graph, and run it in minutes.

Data construction & result assertion: use Python Table objects to create inputs and verify outputs.

from turing_script.ad_table import AdTable
from bidding.graph import bidding_graph  # graph under test, registered with the test runner

class BiddingTestCase(EADSGraphTestCase):  # base class provided by the EADS test framework
    def test_demo(self):
        ad_table = AdTable.from_csv_file("ad_input.csv")
        in_tables = {"ad": ad_table}
        expect_table = AdTable.from_csv_file("ad_output.csv")
        out_tbls = self.run_graph(inputs=in_tables, fetches=["ad"])
        assert table_equal(out_tbls["ad"], expect_table)  # compare the graph's output, not its input

Experimentation and Production Release

Graph lineage analysis: automatically extract module topology after code changes for precise experiment targeting.

Graph‑centric experiments: trigger experiments from the graph view, automatically configuring traffic for all affected modules.

Global experiment analysis: aggregate metrics across business scenarios and provide deep‑dive diagnostics.

Together these capabilities enable "develop once, validate everywhere", boosting experiment efficiency by 70% and release efficiency by 75%.
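At its core, the lineage analysis above is reachability over the module DAG: given the nodes a code change touched, find every downstream module so experiment traffic can be configured for exactly that set. A sketch of that idea, with a toy module graph (the edges are hypothetical, not Alimama's real topology):

```python
from collections import deque

def affected_modules(dag, changed):
    """BFS over downstream edges: all modules reachable from the changed nodes."""
    seen = set(changed)
    queue = deque(changed)
    while queue:
        node = queue.popleft()
        for downstream in dag.get(node, ()):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return seen

# toy topology: edges point from an upstream graph to its dependents
dag = {
    "fetch_param_graph": ["bidding_graph"],
    "bidding_graph": ["ranking_graph"],
    "ranking_graph": [],
}
affected_modules(dag, ["fetch_param_graph"])
```

Changing a leaf like `ranking_graph` would yield only itself, which is why lineage-driven targeting avoids blanketing unaffected modules with experiment traffic.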

Case Study: Smart Bidding

Smart bidding combines a C++ personalized bidding operator, a reusable bidding core sub‑graph, and algorithmic data. Before GaaS, integration required manual compile configuration and tightly coupled validation. GaaS consolidates the logic into a single graph asset, allowing local high‑fidelity testing, one‑click global experiments, and graph‑driven production releases.

load("//graph:eads_graph.bzl", "eads_graph")  # .bzl path illustrative; use the rule's actual location

eads_graph(
    name = "bidding_graph",
    graph_main = "bidding_graph.py",
    eads_udfs = [":bidding_udf"],
    eads_datas = [":my_promotion_bidding_data"],
    graph_deps = ["//graph:fetch_param_graph"],
)

Future Outlook

Finer‑grained logic decomposition gives large models a clearer domain model to reason over.

Precise capability boundaries impose stricter engineering specifications on model‑generated code.

Graph‑level testability provides stronger quality controls for AI‑assisted verification.

Tags: serverless, architecture, graph, AdTech, GaaS
Written by Alimama Tech

Official Alimama tech channel, showcasing all of Alimama's technical innovations.
