
iQIYI Effect Advertising: Algorithm Architecture, Click‑Conversion Estimation, and Smart Bidding

The talk details iQIYI’s effect advertising system, describing its feed and in‑frame architecture, the oCPX billing model, multi‑stage recall‑ranking pipelines, real‑time feature engineering, online FM and Wide&Deep models for sparse conversion prediction, and a smart‑bidding mechanism that balances cost, quality, and volume.

iQIYI Technical Product Team

Guest introduction: Wang Hui, Senior Engineer at iQIYI.

Overview: With the rapid development of big data and artificial intelligence, the digital marketing industry is evolving quickly. Leveraging iQIYI's strong technical foundation, its effect advertising platform has achieved massive reach, precise targeting, and measurable results. This talk shares the speaker's thinking and practice on personalized effect advertising, focusing on ad ranking algorithms.

1. iQIYI Effect Advertising Algorithm – Background & Architecture

Ad inventory is served by two main engines, feed (information flow) and in-frame placements, plus miscellaneous other slots.

• Feed: composed of quasi-feed (non-native recommendations) and pure feed (native channels such as hot topics).
• In-frame: placements inside the video player, including mid-roll as well as pre-roll and post-roll slots.
• Other: recommendation slots, video-related slots, etc.

2. Billing Model – oCPX

oCPX ("optimized CPX") unifies oCPC (optimized cost per click) and oCPM (optimized cost per mille). Traditional CPX billing charges by clicks or impressions, but advertisers actually care about downstream conversions (download, install, payment). oCPX shifts the optimization target from clicks to conversions: the advertiser sets a conversion goal and a per-conversion bid, and the algorithm handles the rest.

Challenges include high‑dimensional sparse conversion samples, multiple conversion types (install, payment, WeChat follow), and the need for low‑latency computation on massive traffic.

3. Algorithm Challenges

Extremely sparse high‑dimensional conversion samples.

Complex business logic (multiple conversion types).

High throughput and strict timeliness requirements.

4. Personalized Advertising Recommendation Process

Recall – candidate ad retrieval (e.g., audience targeting).

Coarse Ranking – lightweight models for initial selection, including cold‑start and random exploration to mitigate the “rich‑get‑richer” effect.

Fine Ranking – high‑precision models estimating CTR, CVR, and smart bid, with budget smoothing.
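The three-stage funnel above can be sketched as follows. This is a minimal illustration, not iQIYI's actual code: the ad fields (`target_tags`, `coarse_score`, `fine_score`), the exploration fraction, and the candidate counts are all assumptions.

```python
import random

def recall(ads, user):
    # Candidate retrieval, e.g. audience targeting by user tags.
    return [ad for ad in ads if ad["target_tags"] & user["tags"]]

def coarse_rank(candidates, k=100, explore_frac=0.05):
    # Lightweight first cut. A small slice is reserved for random
    # exploration (cold start, mitigating "rich-get-richer").
    ranked = sorted(candidates, key=lambda a: a["coarse_score"], reverse=True)
    n_explore = max(1, int(k * explore_frac)) if len(ranked) > k else 0
    head, tail = ranked[:k - n_explore], ranked[k - n_explore:]
    return head + random.sample(tail, min(n_explore, len(tail)))

def fine_rank(candidates):
    # High-precision stage; in practice the sort key is an eCPM
    # estimate as described in section 5.
    return sorted(candidates, key=lambda a: a["fine_score"], reverse=True)
```

Exploration samples from below the coarse cutoff, so new ads with little feedback history still get a chance to collect impressions.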

5. Ranking Logic

Both coarse and fine ranking use eCPM (effective CPM) as the sorting metric: eCPM = CTR × CVR × Bid × SmartBidFactor.
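As a minimal numeric illustration of this score (the CTR, CVR, and bid values are made up):

```python
def ecpm(ctr, cvr, bid, smart_bid_factor=1.0):
    # eCPM = CTR × CVR × Bid × SmartBidFactor, as stated above.
    # (Some systems also scale by 1000 impressions; that constant is
    # omitted here to match the talk's formula, and it does not change
    # the ranking order anyway.)
    return ctr * cvr * bid * smart_bid_factor

# A 2% CTR, 10% CVR ad with a per-conversion bid of 50 and a
# smart-bid factor of 1.2:
score = ecpm(0.02, 0.1, 50, 1.2)
```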

6. Online Training & Inference Workflow

Real‑time features (scene, time, feedback) that reflect the current user context.

Short‑term features (recent interests, search, social behavior).

Long‑term features (demographics, long‑term interests, ad attributes, material quality).

Models include FM (day‑wise, online learning), deep learning (Wide&Deep), and reinforcement learning. Online FM models are updated every minute using Kafka streams; a monitoring system triggers automatic fallback to offline models if metrics (e.g., AUC) degrade.

7. Click & Conversion Rate Estimation – Core Issues

Feature Engineering

Features are categorized by latency: real‑time, short‑term, and long‑term. Real‑time features capture the immediate context (what video the user watched before the ad, time of day, recent feedback). Short‑term features reflect recent interests that may shift quickly (e.g., a pregnant user switching from fantasy to maternity content). Long‑term features include stable demographics and historical interests.
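One common way to fold these three feature families into a single sparse index space is feature hashing. The sketch below is illustrative only: the bucket count and feature names are assumptions, not iQIYI's schema.

```python
import hashlib

N_BUCKETS = 2 ** 20  # assumed sparse-feature dimensionality

def hash_feature(name, value, n_buckets=N_BUCKETS):
    # Map a (name, value) categorical pair to a stable bucket index.
    key = f"{name}={value}".encode()
    return int(hashlib.md5(key).hexdigest(), 16) % n_buckets

def featurize(realtime, short_term, long_term):
    # Merge real-time, short-term, and long-term feature dicts into one
    # sorted list of sparse indices for the downstream model.
    feats = {**realtime, **short_term, **long_term}
    return sorted(hash_feature(k, v) for k, v in feats.items())
```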

Online Learning

Minute‑level model updates using FTRL‑based FM ensure freshness; a watchdog switches to a backup offline model when online performance drops.
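A minimal per-coordinate FTRL-Proximal update of the kind typically behind such minute-level online learning. For brevity this sketch is a plain sparse logistic regression rather than an FM, and the hyperparameters are illustrative.

```python
import math

class FTRL:
    # Per-coordinate FTRL-Proximal with L1/L2 regularization,
    # suitable for high-dimensional sparse online learning.
    def __init__(self, alpha=0.1, beta=1.0, l1=1.0, l2=1.0):
        self.alpha, self.beta, self.l1, self.l2 = alpha, beta, l1, l2
        self.z, self.n = {}, {}  # lazy per-coordinate state

    def weight(self, i):
        # L1 thresholding: coordinates with |z| <= l1 stay exactly zero.
        z = self.z.get(i, 0.0)
        if abs(z) <= self.l1:
            return 0.0
        n = self.n.get(i, 0.0)
        return -(z - math.copysign(self.l1, z)) / (
            (self.beta + math.sqrt(n)) / self.alpha + self.l2)

    def update(self, indices, label):
        # One logistic-loss step on a binary sparse example
        # (all present features have value 1).
        p = 1.0 / (1.0 + math.exp(-sum(self.weight(i) for i in indices)))
        g = p - label  # gradient w.r.t. the logit, per active coordinate
        for i in indices:
            n = self.n.get(i, 0.0)
            sigma = (math.sqrt(n + g * g) - math.sqrt(n)) / self.alpha
            self.z[i] = self.z.get(i, 0.0) + g - sigma * self.weight(i)
            self.n[i] = n + g * g
        return p
```

Because the state is two small dicts per model, updates from a Kafka stream stay cheap, and the watchdog can snapshot or discard the online state when it falls back to the offline model.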

Deep Learning

Wide&Deep models are deployed to meet high QPS and stability requirements.

High‑Dimensional Sparsity

Conversion data is extremely sparse; dynamic bucket sizing based on historical positive samples mitigates zero‑count buckets. Lagged conversion (long delay between click and conversion) is handled by treating clicks as negative samples initially and re‑labeling them as positive once conversion occurs, with appropriate weighting. Sudden spikes in conversion volume are smoothed by incorporating non‑target conversions (e.g., download, activation) as weighted positives.
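The re-labeling scheme for lagged conversions can be expressed as a stream transform. The weighting constant below is an assumption; production systems often derive the correction weight from a model of the conversion delay.

```python
def label_stream(events, positive_weight=2.0):
    # events: iterable of (click_id, kind), kind in {"click", "conversion"}.
    # Each click is emitted immediately as a provisional negative; when a
    # conversion for it arrives later, the same click is re-emitted as a
    # positive whose weight compensates for the earlier negative.
    pending = set()
    for click_id, kind in events:
        if kind == "click":
            pending.add(click_id)
            yield (click_id, 0, 1.0)               # provisional negative
        elif kind == "conversion" and click_id in pending:
            pending.discard(click_id)
            yield (click_id, 1, positive_weight)   # re-labeled positive
```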

8. Smart Bidding – Game Theory & Win‑Win

Cost Control

The goal is to keep actual cost near the target while maximizing volume. An eCPM multiplier (smart‑bid factor) adjusts based on the ratio of actual to target cost: if actual cost > target, the factor < 1 to lower cost; otherwise > 1 to increase competitiveness.
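A simple multiplicative controller of this kind might look as follows; the exponent and the clamping range are illustrative assumptions, not the talk's actual parameters.

```python
def smart_bid_factor(actual_cost, target_cost, sensitivity=0.5,
                     lo=0.5, hi=2.0):
    # Multiplier applied to eCPM. If actual cost per conversion runs
    # above target, the factor drops below 1 to rein in spend; if it
    # runs below target, the factor rises above 1 to win more traffic.
    if actual_cost <= 0:
        return 1.0  # no spend history yet: stay neutral
    ratio = target_cost / actual_cost
    factor = ratio ** sensitivity     # damped response to the cost gap
    return max(lo, min(hi, factor))   # clamp to avoid oscillation
```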

Traffic Selection

Beyond cost, traffic quality is considered. A function g(cost, quality) balances cost control with quality, preventing low‑quality high‑price traffic from dominating the spend.
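The talk does not give the form of g(cost, quality); one plausible sketch is a geometric blend of the cost-control factor with a normalized quality score, so that neither term can dominate alone. Both the functional form and the weight are assumptions.

```python
def traffic_score(cost_factor, quality, quality_weight=0.3):
    # One possible g(cost, quality): a weighted geometric mean of the
    # cost-control factor and a traffic-quality score in [0, 1], so
    # low-quality, high-price traffic cannot dominate the spend.
    return (cost_factor ** (1 - quality_weight)) * (quality ** quality_weight)
```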

Overall, the presentation outlines the end‑to‑end system for personalized effect advertising at iQIYI, covering resource architecture, billing evolution, algorithmic challenges, feature pipelines, model training/inference, and smart bidding strategies.

machine learning, feature engineering, deep learning, personalized recommendation, online learning, smart bidding, advertising algorithm