Tag

Prompt Tuning

5 articles collected under this tag.

Alimama Tech
Apr 23, 2025 · Artificial Intelligence

Distribution-aware Graph Prompt Tuning (DAGPrompT) for Heterophilic Graphs

Distribution‑aware Graph Prompt Tuning (DAGPrompT) tackles the mismatch between pre‑training and downstream objectives on heterophilic graphs by combining low‑rank GLoRA adaptation with hop‑specific prompts that recast downstream tasks as link prediction, yielding up to 4.79% accuracy gains and an average 2.43% improvement in few‑shot node classification.
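The low‑rank adaptation the entry mentions can be illustrated with a minimal LoRA‑style sketch: a frozen pre‑trained linear layer is augmented with a trainable low‑rank update, so only a small number of parameters are tuned downstream. This is an illustrative approximation of the idea, not the paper's actual GLoRA code; all class and parameter names here are invented.

```python
import torch
import torch.nn as nn

class LowRankAdapter(nn.Module):
    """Illustrative LoRA-style adapter: the frozen weight W is
    augmented with a trainable low-rank update B @ A (rank r << d)."""
    def __init__(self, linear: nn.Linear, rank: int = 4):
        super().__init__()
        self.base = linear
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pre-trained weights
        d_out, d_in = linear.weight.shape
        # A is small-random, B starts at zero so training begins at W
        self.A = nn.Parameter(torch.randn(rank, d_in) * 0.01)
        self.B = nn.Parameter(torch.zeros(d_out, rank))

    def forward(self, x):
        # base output plus the low-rank correction x (BA)^T
        return self.base(x) + x @ (self.B @ self.A).T

layer = LowRankAdapter(nn.Linear(16, 8), rank=2)
out = layer(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 8])
```

With rank 2 here only 2·16 + 8·2 = 48 parameters are trainable versus the frozen layer's 136, which is the efficiency argument behind low‑rank tuning.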

Graph Neural Networks · Pretraining · Prompt Tuning
0 likes · 9 min read
ByteDance Web Infra
Jun 16, 2023 · Artificial Intelligence

How AIGC Transforms Document Search: Architecture, Techniques, and Future Directions

This article explains how AI‑generated content (AIGC) reshapes document search by combining traditional indexing with modern embedding and prompt‑tuning techniques, reviews key components such as LangChain and Supabase, compares existing AI‑search products, and discusses the future blend of classic and AI‑driven search.

AI Search · AIGC · LangChain
0 likes · 15 min read
Kuaishou Tech
Apr 24, 2023 · Artificial Intelligence

Divide‑and‑Conquer Embedding‑Based Retrieval with Prompt‑Based Multi‑Task Learning for Large‑Scale Recommendation

This paper identifies the trade‑off between simple and hard negatives in embedding‑based retrieval for recommendation, proposes a clustering‑based divide‑and‑conquer framework combined with prompt‑driven multi‑task learning to improve relevance, diversity, and fairness, and validates the approach through offline metrics, online A/B tests, and comparative experiments.

Prompt Tuning · Recommendation systems · approximate nearest neighbor
0 likes · 9 min read
Sohu Tech Products
Mar 22, 2023 · Artificial Intelligence

An Overview of Prompt Learning in Natural Language Processing

This article reviews the evolution of NLP training paradigms, explains why prompt learning is needed, defines its core concepts, and surveys major hard‑template and soft‑template methods such as PET, LM‑BFF, P‑tuning, and Prefix‑tuning, highlighting their advantages for few‑shot and zero‑shot scenarios.
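The soft‑template methods surveyed in this entry (P‑tuning, Prefix‑tuning) share one core move: prepend a few trainable prompt vectors to the token embeddings and tune only those, leaving the backbone frozen. The sketch below illustrates that setup under stated assumptions — the wrapper class, its names, and the toy encoder are invented for illustration, not taken from any of the surveyed papers.

```python
import torch
import torch.nn as nn

class SoftPromptWrapper(nn.Module):
    """Illustrative soft-prompt setup: trainable prompt vectors are
    prepended to the token embeddings; the backbone stays frozen."""
    def __init__(self, embed: nn.Embedding, encoder: nn.Module, n_prompts: int = 5):
        super().__init__()
        self.embed = embed
        self.encoder = encoder
        for p in list(embed.parameters()) + list(encoder.parameters()):
            p.requires_grad = False  # only the prompts are tuned
        self.prompts = nn.Parameter(
            torch.randn(n_prompts, embed.embedding_dim) * 0.02)

    def forward(self, token_ids):  # token_ids: (batch, seq)
        tok = self.embed(token_ids)  # (batch, seq, d)
        pfx = self.prompts.unsqueeze(0).expand(token_ids.size(0), -1, -1)
        # prepend the prompt vectors, then run the frozen encoder
        return self.encoder(torch.cat([pfx, tok], dim=1))

embed = nn.Embedding(100, 32)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True),
    num_layers=1)
model = SoftPromptWrapper(embed, encoder, n_prompts=5)
out = model(torch.randint(0, 100, (2, 7)))
print(out.shape)  # torch.Size([2, 12, 32])
```

This is why soft‑template methods suit few‑shot settings: the only gradients computed are for the handful of prompt vectors, so very little labeled data is needed to fit them.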

NLP · Prompt Tuning · few-shot
0 likes · 10 min read
Xiaohongshu Tech REDtech
Nov 11, 2022 · Artificial Intelligence

Language Model as a Service and Black‑Box Optimization: Insights from Prof. Qiu Xipeng’s Talk

Prof. Qiu Xipeng’s talk highlighted how large language models can be offered as a service and efficiently adapted via in‑context learning, lightweight label‑tuning, and gradient‑free black‑box optimization, showcasing a unified asymmetric Transformer (CPT) that handles understanding, generation, aspect‑based sentiment analysis (ABSA), and named entity recognition (NER) tasks while reducing resource demands.

LLM · NLP · Pretraining
0 likes · 15 min read