Adaptive Information Transfer Multi-task (AITM) Framework for Sequential User Conversion Modeling in Targeted Display Advertising

The Adaptive Information Transfer Multi‑task (AITM) framework integrates multi‑task learning with an attention‑based information‑transfer module to jointly model the sequential conversion chain in targeted display ads, mitigating class imbalance and boosting end‑to‑end user acquisition rates, as demonstrated by offline and online experiments.

Meituan Technology Team

Many businesses rely on targeted display advertising for user acquisition, and credit‑card advertising in particular involves a long, multi‑step conversion chain. This article introduces the Adaptive Information Transfer Multi‑task (AITM) framework, which combines multi‑task learning with adaptive information transfer to model sequential dependencies among user conversions and improve end‑to‑end acquisition rates.

Background

The conversion process typically follows: Impression → Click → Application → Approval → Activation. The later stages (approval, activation) have sparse positive samples and severe class imbalance. Multi‑task learning can leverage abundant positive samples from earlier tasks to alleviate this imbalance.
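To make the imbalance concrete, here is a small Python sketch with hypothetical funnel counts (the numbers are illustrative, not from the paper). The end‑to‑end rate is the product of the step‑wise rates, so positive samples thin out rapidly at later stages:

```python
# Hypothetical funnel counts for one campaign (illustrative only).
funnel = {
    "impression": 1_000_000,
    "click": 20_000,
    "application": 1_500,
    "approval": 600,
    "activation": 250,
}

stages = list(funnel)
# Step-wise conversion rate: fraction of users at stage t who reach stage t+1.
step_rates = {
    stages[i + 1]: funnel[stages[i + 1]] / funnel[stages[i]]
    for i in range(len(stages) - 1)
}

# End-to-end rate is the product of all step rates, so positives become
# extremely sparse downstream -- here only 0.025% of impressions activate.
end_to_end = funnel["activation"] / funnel["impression"]
```

Earlier stages (click, application) contribute far more positive labels, which is exactly what multi‑task learning exploits to support the sparse later tasks.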

System Overview

The deployed system in the Meituan app includes four tasks (click, application, approval, activation). A shared embedding layer feeds multiple task‑specific tower networks. An Adaptive Information Transfer (AIT) module transfers information between adjacent tasks using an attention mechanism, allowing the model to learn what and how much information to transfer at each stage.
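As a rough illustration (not the production implementation), the following NumPy sketch wires a shared embedding into per‑task towers and fuses each tower's output with the previous task's representation through a small dot‑product attention. All dimensions, random weights, and the `ait` helper are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size (assumed)
tasks = ["click", "application", "approval", "activation"]

# Shared embedding of one user/ad feature vector (stand-in for the real layer).
x = rng.normal(size=16)
W_emb = rng.normal(size=(16, d)) * 0.1
shared = np.tanh(x @ W_emb)

# One task-specific tower per task on top of the shared embedding.
towers = {t: rng.normal(size=(d, d)) * 0.1 for t in tasks}

def ait(u_t, p_prev, Wq, Wk, Wv):
    """AIT sketch: attention over {current tower output, transferred info},
    letting the model weight how much upstream information to absorb."""
    cand = np.stack([u_t, p_prev])              # (2, d) candidates
    q, k, v = cand @ Wq, cand @ Wk, cand @ Wv
    scores = (q * k).sum(axis=1) / np.sqrt(d)   # per-candidate relevance
    w = np.exp(scores - scores.max())
    w /= w.sum()                                # softmax attention weights
    return w @ v                                # fused representation, (d,)

Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
w_out = rng.normal(size=d) * 0.1

# Chain the tasks: each stage fuses its tower output with the previous
# stage's fused representation before predicting a conversion probability.
probs, z_prev = {}, None
for t in tasks:
    u_t = np.tanh(shared @ towers[t])
    z_t = u_t if z_prev is None else ait(u_t, z_prev, Wq, Wk, Wv)
    probs[t] = 1.0 / (1.0 + np.exp(-(z_t @ w_out)))
    z_prev = z_t
```

The key design point is that transfer happens only between adjacent tasks and the attention weights are learned, so each stage decides both what and how much to inherit.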

Model

The AIT module computes attention weights to blend transferred information with the current task's representation. The loss combines per‑task cross‑entropy with a behavior‑expectation calibrator that enforces the sequential constraint: a later‑stage conversion probability should never exceed the preceding stage's, and violations of the conversion order are penalized.
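A minimal sketch of such a loss, assuming the calibrator is a hinge‑style penalty `max(p_t - p_{t-1}, 0)` weighted by a hyper‑parameter `alpha` (the names and the single‑example shape are illustrative):

```python
import numpy as np

def aitm_loss(y_true, y_pred, alpha=0.6, eps=1e-7):
    """Cross-entropy over all tasks plus a behavior-expectation calibrator.

    y_true, y_pred: arrays of shape (num_tasks,) for one example, ordered
    click -> application -> approval -> activation. alpha is an assumed
    weighting hyper-parameter for the calibrator term.
    """
    p = np.clip(y_pred, eps, 1 - eps)
    # Standard binary cross-entropy summed over the task chain.
    ce = -(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)).sum()
    # Sequential constraint: penalize any later-stage probability that
    # exceeds the previous stage's probability.
    calib = np.maximum(y_pred[1:] - y_pred[:-1], 0.0).sum()
    return ce + alpha * calib
```

With monotonically decreasing predictions the calibrator contributes nothing; whenever the predicted chain inverts the conversion order, the penalty pushes the model back toward a valid funnel.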

Experiments

Offline experiments on an industrial Meituan dataset and a public Alibaba dataset show AITM outperforms strong baselines (LightGBM, MLP) in AUC. Online A/B tests confirm a significant lift in conversion metrics. Ablation studies, hyper‑parameter analyses, and case studies demonstrate the effectiveness of the AIT module and the calibrator in handling sparse positive samples and sequential dependencies.

Conclusion

AITM successfully models sequential dependencies in multi‑step conversions, achieving notable gains in both offline and online settings. The framework has been deployed in Meituan’s credit‑card advertising pipeline to serve high‑conversion users in real time.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: machine learning, Multi-Task Learning, conversion rate, Sequential Modeling, targeted advertising, AITM
Written by

Meituan Technology Team

Over 10,000 engineers powering China's leading lifestyle services e‑commerce platform, supporting hundreds of millions of consumers and millions of merchants across 2,000+ industries. This is the public channel for the tech teams behind Meituan, Dianping, Meituan Waimai, Meituan Select, and related services.
