Tag: negative samples


DataFunTalk
Feb 2, 2024 · Artificial Intelligence

Utilizing Negative Samples for Knowledge Distillation of Large Language Models

This paper presents a novel framework that leverages negative samples during large language model distillation through three stages—Negative Assistant Training, Negative Calibration Enhancement, and Adaptive Self‑Consistency—demonstrating significant accuracy gains on challenging mathematical reasoning benchmarks and improved generalization to out‑of‑distribution tasks.

Chain-of-Thought · LLM distillation · knowledge transfer
13 min read
Xiaohongshu Tech REDtech
Jan 12, 2024 · Artificial Intelligence

Negative Sample Assisted Distillation for Large Language Models

This AAAI‑2024 paper introduces a Negative Sample Assisted Distillation framework—comprising Negative Assistant Training, Negative Calibration Enhancement, and Adaptive Self‑Consistency—that leverages both correct and incorrect reasoning examples to train a compact LLaMA‑7B student, achieving up to a 75.75% accuracy gain over plain fine‑tuning on MATH and improving results on out‑of‑domain benchmarks.
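
The Adaptive Self‑Consistency stage can be pictured as a weighted variant of self‑consistency voting: the student samples several chain‑of‑thought rationales, each candidate answer accumulates a score from its rationales, and the highest‑scoring answer wins. The sketch below is a minimal illustration under that assumption; `weight_fn` is a hypothetical stand‑in for the paper's rationale‑scoring model (which is trained with negative samples), not the actual implementation.

```python
from collections import defaultdict

def adaptive_self_consistency(samples, weight_fn):
    """Aggregate sampled (rationale, answer) pairs by weighted vote.

    samples:   list of (rationale, answer) pairs drawn from the student model.
    weight_fn: scores a rationale's reliability; here a hypothetical stand-in
               for a scorer trained on positive and negative rationales.
    """
    votes = defaultdict(float)
    for rationale, answer in samples:
        votes[answer] += weight_fn(rationale)  # weighted, not one-vote-each
    return max(votes, key=votes.get)

# Toy usage with a made-up length-based weight (illustration only):
samples = [
    ("short bad path", "12"),
    ("careful step-by-step derivation", "15"),
    ("another careful path", "15"),
]
print(adaptive_self_consistency(samples, lambda r: len(r)))  # prints "15"
```

With a uniform `weight_fn` this reduces to ordinary self‑consistency (plain majority vote); the adaptive element is that unreliable rationales contribute less to the tally.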

Chain-of-Thought · LLM · knowledge distillation
13 min read