PEPNet: Parameter and Embedding Personalized Network for Multi‑Task Multi‑Domain Recommendation
The paper introduces PEPNet, a plug‑and‑play network that tackles the domain‑seesaw and task‑seesaw problems in multi‑scenario recommendation by using a gated personalization module (GateNU) together with embedding‑level (EPNet) and parameter‑level (PPNet) personalization, and demonstrates its superiority through extensive offline and online experiments on Kuaishou data.
The authors identify two critical challenges in large‑scale recommendation systems that operate across multiple business scenarios and interaction tasks: (1) the domain seesaw problem, where training separate models for each scenario is costly and mixing data ignores domain‑specific distributions; and (2) the task seesaw problem, where imbalanced task sparsity and inter‑task dependencies lead to sub‑optimal performance.
To address these issues, they propose PEPNet (Parameter and Embedding Personalized Network), an efficient plug‑and‑play architecture composed of three modules:
GateNU: a lightweight gating unit with two fully‑connected layers (ReLU followed by Sigmoid) whose output is scaled by a hyper‑parameter (set to 2) to generate personalized gates.
EPNet (Embedding Personalized Network): incorporates domain‑specific IDs and statistical features into the shared embedding layer, combines them with the shared embeddings via a stop‑gradient operation, and applies the GateNU‑generated weights to produce domain‑aware embeddings.
PPNet (Parameter Personalized Network): injects user/item/author priors (e.g., IDs, age, video category) into the DNN tower layers, using GateNU outputs to scale each layer element‑wise, thereby personalizing the entire tower.
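The GateNU unit described above is small enough to sketch directly. The following is a minimal NumPy illustration (not the paper's code; weight shapes and names are invented for the example): two fully‑connected layers, ReLU after the first, Sigmoid after the second, with the Sigmoid output scaled by a hyper‑parameter gamma, for which the paper reports 2 as optimal.

```python
import numpy as np

rng = np.random.default_rng(0)

def gate_nu(x, W1, b1, W2, b2, gamma=2.0):
    """GateNU sketch: FC -> ReLU -> FC -> Sigmoid, scaled by gamma."""
    h = np.maximum(0.0, x @ W1 + b1)               # first FC layer + ReLU
    return gamma / (1.0 + np.exp(-(h @ W2 + b2)))  # second FC layer + scaled Sigmoid

# Toy usage: an 8-d gate input producing 32 gate values, each in (0, 2).
x = rng.normal(size=(4, 8))
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 32)), np.zeros(32)
g = gate_nu(x, W1, b1, W2, b2)
```

Scaling the Sigmoid by 2 lets the gate amplify a feature (values above 1) as well as suppress it, rather than only attenuate as a plain Sigmoid would.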
The overall model integrates EPNet and PPNet so that both embedding representations and tower parameters are personalized for each user, mitigating the double seesaw phenomenon.
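Under the same caveats (a NumPy sketch with invented shapes and variable names, not the paper's implementation), the interplay of the two modules can be illustrated end to end: EPNet gates the shared embedding with a domain‑conditioned GateNU, and PPNet gates every tower layer with a prior‑conditioned GateNU.

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(0.0, x)

def gate_nu(x, W1, b1, W2, b2, gamma=2.0):
    """GateNU: FC -> ReLU -> FC -> Sigmoid scaled by gamma (paper uses 2)."""
    return gamma / (1.0 + np.exp(-(relu(x @ W1 + b1) @ W2 + b2)))

# --- EPNet: domain-aware gating of the shared embedding --------------------
D, H = 32, 16
e_shared = rng.normal(size=(1, D))   # shared embedding (stop-gradient before the gate)
e_domain = rng.normal(size=(1, 8))   # domain-side features (scenario ID, statistics)
gate_in = np.concatenate([e_domain, e_shared], axis=1)
Wd1, bd1 = rng.normal(size=(gate_in.shape[1], H)), np.zeros(H)
Wd2, bd2 = rng.normal(size=(H, D)), np.zeros(D)
delta_domain = gate_nu(gate_in, Wd1, bd1, Wd2, bd2)
e_personalized = delta_domain * e_shared  # element-wise, domain-aware embedding

# --- PPNet: per-layer gating of the DNN tower ------------------------------
priors = rng.normal(size=(1, 12))  # user/item/author priors (IDs, age, category)
pp_in = np.concatenate([priors, e_personalized], axis=1)  # stop-grad in practice
h = e_personalized
for out_dim in (24, 24):           # illustrative tower hidden sizes
    W, b = rng.normal(size=(h.shape[1], out_dim)) * 0.1, np.zeros(out_dim)
    Wg1, bg1 = rng.normal(size=(pp_in.shape[1], H)), np.zeros(H)
    Wg2, bg2 = rng.normal(size=(H, out_dim)), np.zeros(out_dim)
    h = gate_nu(pp_in, Wg1, bg1, Wg2, bg2) * relu(h @ W + b)  # gate each layer
```

The stop‑gradient (marked in comments) matters in a real framework: it keeps the personalization gates from distorting the shared embeddings' own gradient updates.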
Extensive experiments were conducted on three Kuaishou scenarios (fast‑version discovery, the Featured page, and double‑column discovery) covering six tasks (Like, Follow, Forward, Hate, Click, Effective View). Offline results show that PEPNet consistently outperforms strong baselines such as DeepFM, DCN, xDeepFM, DCNv2, SharedBottom, MMoE, PLE, and various multi‑domain variants. Ablation studies confirm that removing either EPNet or PPNet degrades performance, and adding PEPNet to other models yields noticeable gains. Hyper‑parameter analysis reveals the best settings: EPNet benefits from larger embedding dimensions, PPNet performs best with four DNN layers, and a GateNU scaling factor of 2 is optimal.
Online A/B testing across the three scenarios demonstrates that PEPNet improves key metrics (likes, follows, forwards, watch time) by over 1% and serves more than 300 million daily users, validating its practical impact.
Kuaishou Tech
Official Kuaishou tech account, providing real-time updates on the latest Kuaishou technology practices.