DataFunTalk
Aug 7, 2021 · Artificial Intelligence
Multi-Category Mixture-of-Experts Model for JD Search Ranking
This article presents a multi‑category Mixture‑of‑Experts (MoE) approach for e‑commerce search ranking. To handle category‑specific user behavior and the data scarcity of small categories, it introduces hierarchical soft constraints and adversarial regularization, and reports significant AUC and NDCG gains on Amazon and JD in‑house datasets.
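To make the core idea concrete, here is a minimal sketch of a dense mixture‑of‑experts scorer: a softmax gate mixes the outputs of several per‑category expert scorers for each query‑item feature vector. This is an illustrative toy with linear experts and randomly initialized weights, not the paper's architecture; the class and parameter names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax along the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    """Toy dense MoE: a softmax gate mixes per-expert linear scorers."""
    def __init__(self, n_experts, dim):
        self.gate_w = rng.normal(size=(dim, n_experts))    # gating network weights
        self.expert_w = rng.normal(size=(n_experts, dim))  # one linear expert per row

    def score(self, x):
        # x: (batch, dim) feature vectors for query-item pairs
        gates = softmax(x @ self.gate_w)             # (batch, n_experts) mixing weights
        expert_scores = x @ self.expert_w.T          # (batch, n_experts) per-expert scores
        return (gates * expert_scores).sum(axis=-1)  # (batch,) mixed ranking score

moe = MixtureOfExperts(n_experts=4, dim=8)
x = rng.normal(size=(2, 8))
print(moe.score(x).shape)  # (2,)
```

In a multi‑category setting, the gate would typically also condition on the item's category so that large categories dominate the experts they need while small categories can share statistical strength; the hierarchical soft constraints and adversarial regularization mentioned above are ways of shaping that sharing.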
Adversarial Regularization · Hierarchical Soft Constraint · Mixture of Experts