Tag: sparse features


Ximalaya Technology Team
Oct 9, 2023 · Artificial Intelligence

DeepRec-Based High-Dimensional Sparse Feature Support and Real-Time Model Training in Ximalaya AI Cloud

Ximalaya AI Cloud leverages DeepRec's Embedding Variable to elastically manage high-dimensional sparse features with low collision rates, supporting feature admission/eviction, multi-level storage, and minute-level incremental model updates; together these improvements boost GPU utilization, halve training time, and lift recommendation CTR by 2-3% while maintaining latency.

AI Cloud · DeepRec · Recommendation systems
13 min read
AntTech
Jul 12, 2023 · Artificial Intelligence

Hybrid Embedding Architecture for Large‑Scale Sparse CTR Models

This article describes the Hybrid Embedding solution proposed by Ant AI Infra to address storage, resource, and feature‑governance challenges of massive sparse CTR models, detailing its multi‑layer storage design, KV‑based parameter server, and performance gains in large‑scale recommendation systems.

AI Infra · Hybrid Embedding · CTR
9 min read
360 Zhihui Cloud Developer
Sep 16, 2020 · Artificial Intelligence

How TensorNet Supercharges Sparse Feature Training on TensorFlow

TensorNet is a TensorFlow-based distributed training framework optimized for massive sparse-feature models in advertising and recommendation; it dramatically reduces parameter-sync overhead, supports effectively unbounded sparse-feature dimensions, cuts training time from hours to minutes, and improves inference performance by up to 35%.

AI · Recommendation systems · TensorFlow
10 min read
360 Tech Engineering
Sep 14, 2020 · Artificial Intelligence

TensorNet: A Distributed Training Framework Optimized for Large-Scale Sparse Feature Models on TensorFlow

TensorNet is a TensorFlow-based distributed training framework that tackles the challenges of massive data and billions of sparse parameters in advertising and recommendation systems, supporting effectively unbounded sparse-feature dimensions, drastically reducing synchronization overhead, and delivering up to 35% faster inference.

AI infrastructure · TensorFlow · distributed training
8 min read
DataFunTalk
May 15, 2020 · Artificial Intelligence

Optimizing Sparse Feature Embedding for Large‑Scale Recommendation and CTR Prediction

The article reviews recent research on representing massive sparse features in click-through-rate (CTR) models, introducing Alibaba's Res-embedding method and Google's Neural Input Search (NIS), and discusses how these techniques improve embedding efficiency and model generalization in large-scale recommendation systems.

CTR prediction · Recommendation systems · deep learning
10 min read
DataFunTalk
Apr 12, 2020 · Artificial Intelligence

Wang Zhe’s Machine Learning Notes – Answers to Frequently Asked Questions on Recommendation Systems

In this article, Wang Zhe answers fifteen common questions about recommendation systems, covering building cross-domain knowledge, the role of deep reinforcement learning, handling sparse or low-sample data, offline/online evaluation, knowledge graphs, graph neural networks, model interpretability, large-scale ID embedding, and career advice for engineers.

Recommendation systems · deep learning · graph neural network
14 min read