
Deep Learning Applications in 58.com Intelligent Recommendation System

This article details how 58.com leverages deep learning models such as FNN, Wide&Deep, CNN+DNN, and YouTube DNN recall, along with a custom AI platform, to enhance recommendation ranking and recall, achieving measurable improvements in click‑through rates and overall system performance.

58 Tech

58.com, China’s largest classified information platform, built an intelligent recommendation system to help users discover valuable content and improve the user experience. This article focuses on the practical application of deep learning within that system.

FNN (Factorization‑machine supported Neural Networks) uses FM to transform sparse features into dense embeddings, which are then fed into a neural network. Distributed FM (DiFacto) trains the feature weights and latent vectors, which are concatenated per field and fed into the NN. Deployed in January 2017, the FNN model raised the post‑click rate by about 5% compared with the baseline.
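The FNN forward pass can be sketched as follows. This is a minimal NumPy illustration, not 58.com's implementation: the field count, vocabulary size, latent dimension, and single hidden layer are all hypothetical, and the "pre-trained FM" parameters are random stand-ins. The key idea it shows is that each field's FM weight and latent vector are concatenated to form the dense first layer of the network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 3 fields, 10 feature ids per field,
# FM latent dimension 4, one hidden layer of width 8.
n_fields, vocab, k, hidden = 3, 10, 4, 8

# Stand-ins for pre-trained FM parameters: a first-order weight and a
# latent vector per feature id (trained by DiFacto in the article).
fm_w = rng.normal(size=(n_fields, vocab))
fm_v = rng.normal(size=(n_fields, vocab, k))

# Randomly initialised NN layers (trained by backprop in practice).
W1 = rng.normal(size=(n_fields * (k + 1), hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(size=(hidden,))
b2 = 0.0

def fnn_forward(active_ids):
    """active_ids[f] is the active feature id in field f (one-hot input)."""
    # Per field, concatenate the FM weight and latent vector; the per-field
    # results are concatenated into the dense input of the network.
    z0 = np.concatenate(
        [np.concatenate(([fm_w[f, i]], fm_v[f, i])) for f, i in enumerate(active_ids)]
    )
    h = np.tanh(z0 @ W1 + b1)                      # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # predicted click probability

p = fnn_forward([2, 5, 7])
```

Because the FM parameters are fixed before the NN is trained, the embeddings here are not updated end-to-end, which is the contrast the article draws with Wide&Deep below.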

Wide&Deep combines a wide linear model (using raw one‑hot features and manually crossed features) with a deep neural network that learns embeddings for sparse inputs. Using TensorFlow’s open‑source implementation, distributed training increased offline AUC by 0.026. Unlike FNN, Wide&Deep learns embeddings end‑to‑end.
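The wide/deep split can be sketched as below. This is an illustrative NumPy toy, not the TensorFlow implementation the article used: the two sparse fields, the single manual cross feature, and the layer sizes are assumptions. It shows how a linear model over raw and crossed one-hot ids and an embedding-based MLP contribute one logit each.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 2 sparse fields with 5 ids each, embedding dim 3.
vocab, emb_dim, hidden = 5, 3, 6

# Wide part: one weight per raw one-hot id plus one per crossed (a, b) pair.
w_wide = rng.normal(size=(2 * vocab + vocab * vocab,))

# Deep part: embeddings learned jointly with the network (end-to-end).
emb = rng.normal(size=(2, vocab, emb_dim))
W1 = rng.normal(size=(2 * emb_dim, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(size=(hidden,)); b2 = 0.0

def wide_deep_forward(id_a, id_b):
    # Wide: raw ids plus the manually crossed feature id_a x id_b.
    wide_logit = (w_wide[id_a]
                  + w_wide[vocab + id_b]
                  + w_wide[2 * vocab + id_a * vocab + id_b])
    # Deep: look up embeddings, concatenate, pass through an MLP.
    x = np.concatenate([emb[0, id_a], emb[1, id_b]])
    h = np.maximum(x @ W1 + b1, 0.0)               # ReLU hidden layer
    deep_logit = h @ W2 + b2
    # Joint prediction: wide and deep logits are summed before the sigmoid.
    return 1.0 / (1.0 + np.exp(-(wide_logit + deep_logit)))

p = wide_deep_forward(1, 3)
```

In the real system both parts are trained jointly, so the deep embeddings are learned end-to-end rather than pre-trained as in FNN.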

To further enrich features, high‑order features generated by XGBoost (GBDT) were fed into the wide part, creating an XGBoost+Wide&Deep model that added 0.018 offline AUC, while feeding the same high‑order vectors into the deep part gave a modest additional gain.
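The XGBoost feature-generation step works by encoding, per tree, which leaf an example falls into. The sketch below uses hand-written depth-2 stand-in trees rather than a trained XGBoost model (the thresholds and tree shapes are invented for illustration), but the one-hot-leaf-per-tree encoding it produces is the kind of high-order vector the article describes feeding into the wide part.

```python
import numpy as np

# Hypothetical stand-in for trained GBDT trees: each "tree" maps a dense
# feature vector to one of 4 leaves via two threshold tests.
def tree_leaf(x, thresholds):
    left = int(x[0] > thresholds[0])
    right = int(x[1] > thresholds[1])
    return 2 * left + right                        # leaf id in 0..3

trees = [np.array([0.5, 0.2]), np.array([0.1, 0.8])]  # per-tree thresholds
n_leaves = 4

def leaf_one_hot(x):
    """High-order feature vector: one-hot leaf index per tree, concatenated.
    In XGBoost+Wide&Deep this vector is appended to the wide part's input."""
    vec = np.zeros(len(trees) * n_leaves)
    for t, th in enumerate(trees):
        vec[t * n_leaves + tree_leaf(x, th)] = 1.0
    return vec

v = leaf_one_hot(np.array([0.7, 0.5]))
```

With a real XGBoost model, the per-tree leaf indices would come from the trained booster's leaf-prediction output instead of the toy `tree_leaf` above.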

CNN+DNN addresses scenarios with large image assets (e.g., high‑resolution housing photos). A pre‑trained CNN extracts image embeddings, which are concatenated with other dense and sparse features and fed into the deep part of Wide&Deep. This architecture improved offline AUC by ~0.01.
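The CNN+DNN wiring can be sketched as follows. The "CNN" here is a frozen stand-in (global average pooling plus a fixed random projection); the real system uses a pre-trained convolutional network, and the feature dimensions are invented. What the sketch shows is the concatenation step: the image embedding joins the other dense and sparse features before the deep part.

```python
import numpy as np

rng = np.random.default_rng(2)

# Frozen projection standing in for the last layer of a pre-trained CNN.
proj = rng.normal(size=(3, 4))                     # 3 channels -> 4-dim embedding

def cnn_embed(image):
    """image: H x W x 3 array. Global average pooling + fixed projection."""
    return image.mean(axis=(0, 1)) @ proj

def deep_input(image, dense_feats, sparse_emb):
    # The image embedding is concatenated with the other dense and sparse
    # features to form the input of the deep part of Wide&Deep.
    return np.concatenate([cnn_embed(image), dense_feats, sparse_emb])

x = deep_input(np.ones((8, 8, 3)),                 # toy 8x8 RGB image
               np.array([0.3, 1.2]),               # toy dense features
               np.array([0.1, -0.4, 0.7]))         # toy sparse-feature embedding
```

Because the CNN is pre-trained and frozen, image embeddings can be computed offline once per listing photo rather than at serving time.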

DNN Recall Model (YouTube DNN recall) treats recommendation as a massive multi‑class classification problem, predicting the probability of a user clicking each item. User history is embedded and combined with demographic and context features. The final hidden layer provides item vectors for nearest‑neighbor search using FAISS, yielding ~1% stable CTR lift over the baseline recall model.
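The recall step can be sketched in NumPy. Everything here is a toy: the item count, dimensions, and the one-layer "tower" standing in for the user-side MLP are assumptions, and the nearest-neighbor search is brute force where production uses FAISS. It shows the core mechanism: the softmax output weights double as item vectors, and serving reduces to an inner-product top-k search.

```python
import numpy as np

rng = np.random.default_rng(3)

n_items, dim = 100, 8
# Softmax output weights, one row per item; at serving time these rows
# are the item vectors indexed for nearest-neighbor search.
item_vecs = rng.normal(size=(n_items, dim))

def user_vector(history_ids, context):
    # Average the embeddings of the user's clicked items, add context
    # features, and apply a stand-in for the MLP tower; the final hidden
    # layer's output is the user vector.
    hist = item_vecs[history_ids].mean(axis=0)
    return np.tanh(hist + context)

def recall_top_k(u, k=5):
    # Training: softmax over all items. Serving: top-k by inner product
    # (FAISS in production; brute force here for illustration).
    scores = item_vecs @ u
    return np.argsort(-scores)[:k]

u = user_vector([3, 17, 42], rng.normal(size=dim))
top = recall_top_k(u, k=5)
```

Scoring every item with a softmax is only needed during training; at serving time the approximate nearest-neighbor index makes retrieval over millions of items tractable.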

The entire workflow runs on the 58 Deep Learning Platform (WPAI), which comprises a hardware layer (GPU/CPU nodes), a cluster management layer (Kubernetes, Docker/Nvidia‑Docker, Calico, etcd), an algorithm application layer (TensorFlow, Caffe, with packaged DNN, CNN, and RNN models), and a web management layer for resource requests, job monitoring, and model serving. Offline training uses distributed TensorFlow; online inference is served via TensorFlow Serving wrapped by the proprietary SCF RPC framework.

In summary, the deployment of FNN, Wide&Deep, CNN+DNN for ranking and the YouTube DNN model for recall has demonstrably improved recommendation performance at 58.com, and future work will continue to refine these models and explore deep reinforcement learning.

Tags: CNN, deep learning, recommendation system, DNN, Wide & Deep, FNN
Written by

58 Tech

Official tech channel of 58, a platform for tech innovation, sharing, and communication.
