Alimama Tech
May 25, 2022 · Artificial Intelligence
UKD: Debiasing Conversion Rate Estimation via Uncertainty-regularized Knowledge Distillation
The paper introduces UKD, an uncertainty‑regularized knowledge‑distillation framework for debiasing conversion rate (CVR) estimation. A click‑adaptive teacher generates pseudo‑conversion labels for unclicked impressions, and a student model is trained with an uncertainty‑weighted loss, mitigating sample‑selection bias. On large‑scale advertising datasets, UKD achieves up to a 3.4% CVR improvement and a 4.3% CPA reduction.
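A minimal sketch of the uncertainty‑weighted distillation idea described above, assuming the teacher provides both a pseudo‑conversion probability and a per‑sample uncertainty score for each unclicked impression. The `exp(-u)` weighting and all function names here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def bce(p_hat, p):
    """Element-wise binary cross-entropy between predictions and (pseudo) labels."""
    eps = 1e-7
    p_hat = np.clip(p_hat, eps, 1 - eps)
    return -(p * np.log(p_hat) + (1 - p) * np.log(1 - p_hat))

def uncertainty_weighted_distill_loss(student_pred, teacher_pseudo, teacher_uncertainty):
    """Distillation loss on unclicked impressions, down-weighting pseudo-labels
    the teacher is unsure about (illustrative exp(-u) weighting)."""
    weights = np.exp(-teacher_uncertainty)
    return float(np.mean(weights * bce(student_pred, teacher_pseudo)))

# Example: the same predictions incur a smaller loss when teacher uncertainty is high,
# so noisy pseudo-labels contribute less to the student's gradient.
student_pred = np.array([0.2, 0.7])
teacher_pseudo = np.array([0.1, 0.8])
loss_confident = uncertainty_weighted_distill_loss(student_pred, teacher_pseudo, np.zeros(2))
loss_uncertain = uncertainty_weighted_distill_loss(student_pred, teacher_pseudo, np.ones(2))
```

In practice the teacher's uncertainty could come from, e.g., MC‑dropout variance over its pseudo‑label predictions; the key point is only that low‑confidence pseudo‑labels are discounted in the student's loss.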
CVR debiasing · advertising algorithms · conversion rate estimation