Survey of Classic Recommendation Algorithms: LR, FM, FFM, WDL, DeepFM, DCN, and xDeepFM

This article surveys classic recommendation algorithms—including Logistic Regression, Factorization Machines, Field‑aware FM, Wide & Deep, DeepFM, DCN, and xDeepFM—explaining their principles, feature preprocessing, problem scopes, and industrial applications within personalized recommendation systems.

DataFunTalk
Personalized recommendation systems aim to predict user interests by analyzing behavior and matching users with items. This article reviews classic recommendation algorithms, focusing on their mathematical foundations, the problems they address, and practical usage.

Feature preprocessing: User and item attributes are categorized as continuous (e.g., price, sales) or categorical (e.g., gender, membership level). Continuous features are normalized, while categorical features are transformed via one‑hot or hash encoding, resulting in fully numeric inputs suitable for model training.
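
A minimal sketch of this preprocessing step (the feature names and raw values are illustrative, not from the article):

```python
import numpy as np

# Hypothetical raw samples: price and sales are continuous, gender is categorical.
prices = np.array([10.0, 50.0, 30.0])
sales = np.array([100.0, 400.0, 250.0])
genders = ["M", "F", "M"]

def min_max(x):
    # Scale a continuous feature into [0, 1].
    return (x - x.min()) / (x.max() - x.min())

def one_hot(values, vocab):
    # Map each categorical value to a one-hot vector over the vocabulary.
    idx = {v: i for i, v in enumerate(vocab)}
    out = np.zeros((len(values), len(vocab)))
    for row, v in enumerate(values):
        out[row, idx[v]] = 1.0
    return out

# Fully numeric design matrix: 2 normalized columns + 2 one-hot columns.
X = np.column_stack([min_max(prices), min_max(sales), one_hot(genders, ["M", "F"])])
```

For high-cardinality categoricals (e.g., item IDs), one-hot is typically replaced by hash encoding or learned embeddings, as the later deep models in this survey do.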

Logistic Regression (LR): A linear model combined with a sigmoid function to output click‑through probabilities. It is simple, interpretable, and efficient for online inference.
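
The prediction step is just a dot product passed through a sigmoid; a small sketch with made-up weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_predict(x, w, b):
    # Linear score w.x + b squashed to a click probability in (0, 1).
    return sigmoid(np.dot(w, x) + b)

x = np.array([0.5, 1.0, 0.0])    # preprocessed numeric features
w = np.array([0.2, -0.4, 0.1])   # learned weights (illustrative values)
p = lr_predict(x, w, b=0.1)      # sigmoid(-0.2) ≈ 0.45
```

Because inference is a single sparse dot product, LR serves millions of requests cheaply, which is why it remained an industry workhorse despite its inability to model feature interactions on its own.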

Factorization Machine (FM): Extends LR by modeling second‑order feature interactions through factorized parameters, reducing the parameter explosion of full pairwise interactions and achieving linear time complexity.
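
The linear-time claim comes from rewriting the pairwise sum: instead of computing all n(n−1)/2 inner products ⟨v_i, v_j⟩ x_i x_j, FM uses the identity 0.5 · Σ_f [(Σ_i v_{i,f} x_i)² − Σ_i v_{i,f}² x_i²], which costs O(kn). A sketch:

```python
import numpy as np

def fm_pairwise(x, V):
    # FM second-order term in O(k*n) time instead of O(n^2):
    # 0.5 * sum_f [ (sum_i V[i,f]*x_i)^2 - sum_i V[i,f]^2 * x_i^2 ]
    linear_sq = (V.T @ x) ** 2          # (sum_i v_if x_i)^2 for each factor f
    sq_linear = (V.T ** 2) @ (x ** 2)   # sum_i v_if^2 x_i^2 for each factor f
    return 0.5 * np.sum(linear_sq - sq_linear)

def fm_score(x, w0, w, V):
    # Bias + first-order linear term + factorized second-order interactions.
    return w0 + w @ x + fm_pairwise(x, V)
```

Factorizing also lets FM estimate interactions between feature pairs that never co-occur in training data, since each feature's latent vector is learned from all its interactions.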

Field‑aware FM (FFM): Introduces the concept of fields, assigning separate latent vectors for each feature‑field pair, which improves expressive power at the cost of higher computational overhead.
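
A sketch of the FFM pairwise term; because the latent vector now depends on the partner feature's field, the FM summation trick no longer applies and the cost returns to O(n²k):

```python
import numpy as np

def ffm_pairwise(x, fields, V):
    # V[i, f] is the latent vector feature i uses when interacting with field f.
    # V has shape (n_features, n_fields, k); fields[j] is feature j's field id.
    n = len(x)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            # Each side picks the vector specific to the other side's field.
            total += V[i, fields[j]] @ V[j, fields[i]] * x[i] * x[j]
    return total
```

The per-field vectors multiply the parameter count by the number of fields, which is the "higher computational overhead" the article mentions; in practice FFM uses a smaller k than FM to compensate.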

Wide & Deep Learning (WDL): Combines a wide linear component (memorization) with a deep neural network (generalization) to capture both low‑ and high‑order feature interactions.
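
A forward-pass sketch of the joint prediction, assuming a two-layer MLP for the deep side (layer sizes here are illustrative):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def wide_deep_predict(x_wide, x_deep, w_wide, W1, W2, w_out, b):
    # Wide part: linear model over sparse/crossed features (memorization).
    wide_logit = w_wide @ x_wide
    # Deep part: small MLP over dense embedded features (generalization).
    h1 = relu(W1 @ x_deep)
    deep_logit = w_out @ relu(W2 @ h1)
    # The two logits are summed and squashed by a single shared sigmoid,
    # so both components are trained jointly against the same label.
    return 1.0 / (1.0 + np.exp(-(wide_logit + deep_logit + b)))
```

The key design point is joint training: the wide part memorizes exception rules the deep part over-generalizes, while the deep part covers feature combinations the wide part has never seen.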

DeepFM: Merges FM and a deep neural network, sharing the embedding layer to eliminate manual feature engineering while learning both first‑ and second‑order interactions.
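
A sketch showing the shared embedding table feeding both components (shapes and names are illustrative; a real model would use a framework such as TensorFlow or PyTorch):

```python
import numpy as np

def deepfm_predict(feat_ids, E, w, W1, w2):
    # E is ONE embedding table shared by the FM part and the deep part,
    # which is what removes the need for separate manual feature engineering.
    emb = E[feat_ids]                        # (num_fields, k) looked-up embeddings
    first = w[feat_ids].sum()                # FM first-order term
    s = emb.sum(axis=0)                      # FM second-order term via the O(k*n) trick
    second = 0.5 * np.sum(s * s - (emb * emb).sum(axis=0))
    # Deep part: MLP over the concatenated field embeddings.
    h = np.maximum(W1 @ emb.flatten(), 0.0)
    deep = w2 @ h
    return 1.0 / (1.0 + np.exp(-(first + second + deep)))
```

Unlike Wide & Deep, no hand-crafted cross features are fed to the "wide" side: the FM component learns second-order crosses directly from the shared embeddings.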

Deep & Cross Network (DCN): Replaces FM with a cross network that explicitly constructs high‑order feature crosses without manual engineering, achieving linear complexity per layer.
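
Each cross layer computes x_{l+1} = x_0 (w_l · x_l) + b_l + x_l, so layer l captures crosses up to degree l+1 while adding only 2d parameters (w_l and b_l). A sketch:

```python
import numpy as np

def cross_layer(x0, xl, w, b):
    # x_{l+1} = x0 * (w . x_l) + b + x_l
    # (w . x_l) is a scalar, so the layer costs O(d) time and parameters;
    # the residual "+ x_l" preserves lower-order crosses from earlier layers.
    return x0 * (w @ xl) + b + xl

def cross_network(x0, ws, bs):
    # Stack L cross layers; depth controls the maximum cross degree.
    x = x0
    for w, b in zip(ws, bs):
        x = cross_layer(x0, x, w, b)
    return x
```

In the full DCN, this cross network runs in parallel with a standard MLP over the same embedded input, and their outputs are concatenated before the final sigmoid.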

xDeepFM: Enhances DCN by performing vector‑wise (rather than bit‑wise) feature crosses, integrating explicit high‑order interactions with deep learning for richer representation.
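
The vector-wise crosses come from xDeepFM's Compressed Interaction Network (CIN): interactions are Hadamard products of whole embedding vectors, compressed by learned weights. A sketch of one CIN layer (shapes are illustrative):

```python
import numpy as np

def cin_layer(X0, Xk, W):
    # X0: (m, d) field embeddings; Xk: (h_k, d) previous layer's feature maps;
    # W:  (h_next, h_k, m) compression weights.
    # Crosses are taken vector-wise: whole d-dimensional embeddings are
    # multiplied element-wise, rather than crossing individual bits as in DCN.
    Z = np.einsum('id,jd->ijd', Xk, X0)    # (h_k, m, d) all pairwise Hadamard products
    return np.einsum('hij,ijd->hd', W, Z)  # (h_next, d) compressed feature maps
```

Stacking CIN layers raises the interaction degree by one per layer, and pooling each layer's feature maps yields the explicit high-order part that is combined with an MLP and a linear term in the full xDeepFM model.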

Conclusion: Each algorithm was introduced to solve specific trade‑offs between accuracy and efficiency. While academic evaluation emphasizes predictive performance, industry prioritizes a balance of precision, speed, and deployment simplicity, as illustrated by the diverse real‑world applications listed.

machine learning · Feature Engineering · Deep Learning · recommendation systems · Factorization Machines
Written by

DataFunTalk

Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.
