
Personalized Advertising Ranking and Intelligent Bidding in iQIYI Effect Advertising

This article presents iQIYI's effect advertising system, detailing its dual-engine resource layout, oCPX billing model, algorithmic challenges of high‑dimensional sparse conversion data, the multi‑stage personalized recommendation pipeline, eCPM‑based ranking, online training/inference workflow, and intelligent bidding strategies that balance cost control and traffic quality.

DataFunSummit

Introduction

With advances in big data and AI, iQIYI's effect advertising has evolved to deliver massive reach, precise targeting, and measurable results across platforms and devices. The following sections cover the system's architecture, core challenges, and practical solutions.

1. iQIYI Effect Advertising Algorithm – Background & Architecture

iQIYI's ad inventory is divided into two main engines: feed flow and in‑frame. Feed flow includes quasi‑feed (non‑native recommendations) and pure feed (native channels such as hot topics). In‑frame ads are placed in mid‑frame slots, with additional pre‑ and post‑frame placements. Other slots include "You May Like" and video‑related positions.

2. Billing Method

The oCPX (optimized CPX) model unifies oCPC (optimized cost per click) and oCPM (optimized cost per mille). While traditional CPC billing charges for clicks, advertisers ultimately care about downstream conversions such as app installs, payments, or follow‑ups. oCPX shifts the optimization target from clicks to conversions: advertisers set conversion goals and bids, and the algorithm handles the rest.
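The core mechanic can be sketched in a few lines. Under oCPC, the advertiser's bid is a target cost per conversion, and the platform translates it into an equivalent per-click bid using the predicted conversion rate. The function name and numbers below are illustrative, not iQIYI's actual implementation:

```python
def ocpc_click_bid(target_cpa: float, p_cvr: float) -> float:
    """Convert an advertiser's target cost-per-conversion (CPA) bid
    into an equivalent per-click bid using the predicted conversion
    rate (pCVR) for this request."""
    return target_cpa * p_cvr

# An advertiser willing to pay 50 yuan per install, shown traffic with
# a predicted 2% install rate, is effectively bidding 1 yuan per click.
```

The same idea extends to oCPM by multiplying through by the predicted click-through rate, which is exactly the eCPM ranking formula discussed later.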


3. Algorithmic Challenges

Conversion samples are extremely high‑dimensional and sparse, with very few positive instances.

oCPX must support diverse conversion types (install, payment, public‑account follow, etc.).

Platform scale demands high‑throughput, low‑latency model inference.

4. Personalized Advertising Recommendation Process

The pipeline consists of three stages:

Recall: Candidate ads are retrieved based on the user's targeting criteria.

Coarse Ranking: Lightweight models perform an initial selection to reduce load on the fine‑ranking stage, and also handle cold‑start and exploration to mitigate the Matthew effect.

Fine Ranking: High‑precision models estimate click‑through rate and conversion rate and apply intelligent bidding. Budget smoothing across time slots is also supported.
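The three stages form a funnel, each cheaper stage shrinking the candidate set for the next. A minimal sketch, with ad fields (`targeting`, `prior`, `ctr`, `cvr`, `bid`) and scoring choices that are illustrative assumptions rather than iQIYI's production logic:

```python
def recall(ads, user_tags):
    # Keep only ads whose targeting overlaps the user's tags.
    return [a for a in ads if a["targeting"] & user_tags]

def coarse_rank(ads, k):
    # Lightweight score (here just a precomputed quality prior) trims
    # candidates before the expensive fine-ranking models run.
    return sorted(ads, key=lambda a: a["prior"], reverse=True)[:k]

def fine_rank(ads):
    # Full eCPM-style score with model-estimated CTR and CVR.
    return sorted(ads, key=lambda a: a["ctr"] * a["cvr"] * a["bid"],
                  reverse=True)
```

In production the coarse stage would also reserve a slice of traffic for cold-start exploration, which this sketch omits.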

5. Ranking Logic

Both coarse and fine ranking use eCPM (effective cost per mille) as the sorting metric: eCPM = CTR × CVR × Bid × Smart‑Bid Factor. This reflects the expected revenue per thousand impressions.
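As a worked sketch of the formula (the ×1000 "per mille" scaling is omitted here, since a constant factor does not change the ranking order):

```python
def ecpm(ctr: float, cvr: float, bid: float,
         smart_bid_factor: float = 1.0) -> float:
    """Expected value of one impression under oCPX: the chain
    impression -> click (CTR) -> conversion (CVR), priced at the
    conversion bid and scaled by the intelligent-bidding factor."""
    return ctr * cvr * bid * smart_bid_factor
```

For example, a 2% CTR, 5% CVR, 50-yuan conversion bid, and a 1.2 bidding factor yield a score of 0.06 yuan per impression; ads are sorted by this value at auction time.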

6. Online Training & Inference Workflow

Features flow from bottom to top:

Real‑time features that accurately reflect the online environment.

Model training includes daily‑batch FM models, online‑learning FM, deep learning (Wide & Deep), and reinforcement learning.

Trained models are deployed online, combining offline and real‑time features for inference.
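The online-learning FM mentioned above is trained with FTRL in the streaming path (detailed in the next section). As a self-contained sketch of the per-coordinate FTRL-proximal update on sparse binary features — hyperparameters are illustrative, and a plain logistic-regression weight per feature stands in for the full FM:

```python
import math

class FTRL:
    """Per-coordinate FTRL-proximal for logistic loss on sparse
    binary features, the kind of update behind minute-level
    streaming model refreshes."""

    def __init__(self, alpha=0.1, beta=1.0, l1=1.0, l2=1.0):
        self.alpha, self.beta, self.l1, self.l2 = alpha, beta, l1, l2
        self.z, self.n = {}, {}   # per-feature accumulators

    def _weight(self, i):
        # Closed-form lazy weight; L1 zeroes out rarely-seen features.
        z = self.z.get(i, 0.0)
        if abs(z) <= self.l1:
            return 0.0
        n = self.n.get(i, 0.0)
        return -(z - math.copysign(self.l1, z)) / (
            (self.beta + math.sqrt(n)) / self.alpha + self.l2)

    def predict(self, features):
        s = sum(self._weight(i) for i in features)
        return 1.0 / (1.0 + math.exp(-s))

    def update(self, features, label):
        p = self.predict(features)
        g = p - label  # logistic-loss gradient for each active feature
        for i in features:
            n = self.n.get(i, 0.0)
            sigma = (math.sqrt(n + g * g) - math.sqrt(n)) / self.alpha
            self.z[i] = self.z.get(i, 0.0) + g - sigma * self._weight(i)
            self.n[i] = n + g * g
        return p
```

Each Kafka mini-batch would call `update` per labeled event, so the model tracks the live traffic distribution minute by minute.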

7. Click & Conversion Rate Estimation – Core Issues

Feature engineering distinguishes real‑time, short‑term, and long‑term features. Real‑time features capture context, time, and feedback. Short‑term features reflect recent viewing interests, while long‑term features include demographics, long‑term interests, and ad attributes.

Key problems addressed:

High‑dimensional sparsity – solved by dynamic bucket sizing based on historical positive samples.

Latency – online FTRL‑based FM models update every minute using Kafka streams, with monitoring to switch to backup models if metrics degrade.

Conversion lag – treat clicks as negative samples initially, converting to positive once conversion occurs, with weight adjustments.

Conversion spikes – incorporate non‑target conversions (e.g., downloads, activations) as weighted positives to smooth volume.
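The last two label tricks — clicks as provisional negatives until a conversion arrives, and non-target conversions as down-weighted positives — can be sketched together. The weight values and event names here are illustrative assumptions, not iQIYI's production configuration:

```python
# Illustrative weights for non-target conversion events.
AUX_WEIGHTS = {"download": 0.3, "activate": 0.6}

def label_and_weight(conversion, target="install"):
    """Assign (label, sample_weight) to a clicked impression under
    delayed feedback: no conversion seen yet -> provisional negative;
    the target conversion -> full positive; a non-target conversion ->
    down-weighted positive that smooths sparse conversion volume."""
    if conversion is None:
        return 0, 1.0
    if conversion == target:
        return 1, 1.0
    return 1, AUX_WEIGHTS.get(conversion, 0.0)
```

When the target conversion finally lands, the training stream re-emits the sample as a positive, overriding the provisional negative.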

8. Intelligent Bidding – Game Theory & Win‑Win

The goal is cost control while maximizing volume. An intelligent bidding factor modifies eCPM: if actual cost exceeds target, the factor < 1 reduces bid; if cost is below target, the factor > 1 increases competitiveness.

Further refinement selects traffic at the granularity of quality, applying a function g that balances cost and traffic quality, enabling more precise bidding decisions.
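The source does not give the form of g, but one plausible sketch is to tilt the cost-control factor by a per-request quality score, so the headroom granted by the cost controller is spent preferentially on high-quality traffic. Everything below (the power form, the `gamma` parameter) is a hypothetical illustration:

```python
def quality_adjusted_factor(cost_factor: float, quality: float,
                            gamma: float = 0.5) -> float:
    """Hypothetical g(cost_factor, quality): scale the campaign-level
    cost-control factor by a per-request quality score in [0, 1];
    gamma controls how aggressively low-quality traffic is discounted."""
    return cost_factor * quality ** gamma
```

At `quality = 1.0` the bid is the full cost-controlled bid; lower-quality requests are bid down smoothly rather than filtered with a hard threshold.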

For more details, see related articles on AI in iQIYI video ads, iQIYI search ranking model evolution, and the evolution of iQIYI's big data analysis platform.

Tags: advertising, big data, machine learning, click-through rate, ranking algorithm, intelligent bidding, conversion estimation
Written by

DataFunSummit

Official account of the DataFun community, dedicated to sharing big data and AI industry summit news and speaker talks, with regular downloadable resource packs.
