Tag: sentence embeddings


DaTaobao Tech
Apr 12, 2022 · Artificial Intelligence

ArcCSE: Angular Margin Contrastive Learning for Self‑Supervised Text Representation

ArcCSE introduces an angular-margin contrastive loss together with pairwise (dropout-augmented) and triplet (span-masked) relationship modeling to self-supervise text embeddings. The approach yields tighter decision boundaries, better alignment and uniformity, and superior performance on unsupervised STS, SentEval, and Alibaba's retrieval and recommendation systems.
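The core of the angular-margin idea can be sketched as follows: the cosine similarity of each positive pair is replaced by cos(θ + m) before the InfoNCE softmax, which penalizes positives and forces a tighter boundary. This is a minimal NumPy sketch of that idea; the function name, shapes, and default hyperparameters are illustrative assumptions, not the paper's API.

```python
import numpy as np

def arccse_loss(anchors, positives, margin=0.1, temp=0.05):
    """Angular-margin contrastive loss (sketch of the ArcCSE idea).

    For each anchor h_i with positive h_i+, the cosine similarity of the
    positive pair is replaced by cos(theta + margin), tightening the
    decision boundary; the other in-batch positives act as negatives.
    """
    def normalize(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)

    a, p = normalize(anchors), normalize(positives)
    cos = a @ p.T                                  # pairwise cosine similarities
    theta = np.arccos(np.clip(np.diag(cos), -1.0, 1.0))
    logits = cos / temp
    np.fill_diagonal(logits, np.cos(theta + margin) / temp)  # penalized positives
    # InfoNCE: -log softmax of the margin-penalized positive, per row
    log_z = np.log(np.exp(logits).sum(axis=1))
    return float(np.mean(log_z - np.diag(logits)))
```

Because cos(θ + m) ≤ cos(θ), the loss with a nonzero margin is strictly harder than the plain contrastive loss on the same batch.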

NLP · angular margin · contrastive learning
0 likes · 8 min read
DataFunSummit
Feb 12, 2022 · Artificial Intelligence

Advances and Challenges in Post‑BERT Semantic Matching: Negative Sampling, Data Augmentation, and Applications

This article reviews the limitations of pre-trained language models for semantic matching in the post-BERT era, and discusses negative-sample mining, data-augmentation techniques, contrastive learning methods such as ConSERT and SimCSE, and practical deployment considerations for vector-based retrieval systems.

Vector Retrieval · contrastive learning · data augmentation
0 likes · 20 min read
HaoDF Tech Team
Sep 15, 2021 · Artificial Intelligence

Optimizing Question‑Answer Search Similarity in Haodf Online: A Semantic Similarity Model Case Study

This article describes how Haodf Online improved its medical question‑answer search by analyzing search challenges, adopting semantic similarity models based on pre‑trained language embeddings, designing contrastive training tasks, and evaluating the resulting increase in click‑through rate and user engagement.

Medical AI · Natural Language Processing · Search Relevance
0 likes · 12 min read
DataFunTalk
Jun 6, 2021 · Artificial Intelligence

ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer

ConSERT introduces a contrastive self‑supervised framework that enhances BERT‑derived sentence embeddings by applying efficient embedding‑level data augmentations, achieving significant improvements on semantic textual similarity tasks, especially in low‑resource settings, and outperforming previous state‑of‑the‑art methods.
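The embedding-level augmentations ConSERT relies on (token shuffling, token/feature cutoff, dropout) operate directly on a sentence's token-embedding matrix, so two augmented views can be produced without re-running data preprocessing. A minimal NumPy sketch of these four strategies, with names and rates chosen for illustration rather than taken from the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment_embeddings(token_embs, strategy="shuffle", rate=0.15):
    """Embedding-level augmentations in the spirit of ConSERT (sketch).

    token_embs: (seq_len, dim) token-embedding matrix for one sentence.
    """
    x = token_embs.copy()
    if strategy == "shuffle":            # permute token positions
        x = x[rng.permutation(len(x))]
    elif strategy == "token_cutoff":     # zero out whole tokens
        x[rng.random(len(x)) < rate] = 0.0
    elif strategy == "feature_cutoff":   # zero out whole feature dims
        x[:, rng.random(x.shape[1]) < rate] = 0.0
    elif strategy == "dropout":          # element-wise dropout
        x *= (rng.random(x.shape) >= rate)
    return x
```

Two independently augmented views of the same sentence form the positive pair for the contrastive objective, while views of other sentences in the batch serve as negatives.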

BERT · contrastive learning · self-supervised
0 likes · 20 min read