Tag: self-supervised


Rare Earth Juejin Tech Community
Dec 13, 2023 · Artificial Intelligence

Comprehensive Overview of BERT: Architecture, Pre‑training Tasks, and Applications

This article provides a detailed introduction to BERT, covering its bidirectional Transformer encoder architecture, its two pre‑training objectives (Masked Language Modeling and Next Sentence Prediction), model configurations, differences from GPT and ELMo, and a wide range of downstream NLP applications.

BERT · Masked Language Model · NLP
0 likes · 17 min read
DataFunSummit
Feb 6, 2023 · Artificial Intelligence

A Minimalist White‑Box Unsupervised Learning Method Using Sparse Manifold Transform

A recent paper by Prof. Ma Yi and Turing Award winner Yann LeCun introduces a simple, interpretable unsupervised learning approach that combines sparse coding, manifold learning, and slow feature analysis, achieving near‑state‑of‑the‑art performance on MNIST, CIFAR‑10, and CIFAR‑100 without data augmentation or extensive hyper‑parameter tuning.

AI · deep learning · representation learning
0 likes · 8 min read
DaTaobao Tech
Jan 9, 2023 · Artificial Intelligence

Adaptive and Self-Supervised Multi-Scenario Modeling for Taobao Personalized Recommendation

On January 9 from 19:00 to 20:00, algorithm engineer Zhang Yuanliang will present Taobao’s scenario-adaptive, self-supervised multi-scenario recommendation model, detailing its architecture, experimental results, and practical deployment for improving personalized item recall across diverse user contexts.

algorithm · multi-scenario · personalization
0 likes · 1 min read
DataFunSummit
Jun 25, 2022 · Artificial Intelligence

Image and Text Pretraining: Methods, Practices, and Business Applications in Information Flow

This article reviews large‑scale image and multimodal pre‑training techniques, including contrastive learning, self‑supervised reconstruction, and multimodal alignment; it explains data acquisition, model construction, and evaluation metrics, and shows how these methods are applied and optimized in real‑world information‑flow services.

AI · Pretraining · contrastive learning
0 likes · 17 min read
DataFunTalk
Jun 6, 2021 · Artificial Intelligence

ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer

ConSERT introduces a contrastive self‑supervised framework that enhances BERT‑derived sentence embeddings by applying efficient embedding‑level data augmentations, achieving significant improvements on semantic textual similarity tasks, especially in low‑resource settings, and outperforming previous state‑of‑the‑art methods.

BERT · contrastive learning · self-supervised
0 likes · 20 min read