Amap Tech
Sep 2, 2025 · Artificial Intelligence

How Pos2Distill Eliminates Positional Bias in Large Language Models

This article introduces Pos2Distill, a novel knowledge‑distillation framework that transfers capabilities from advantageous to disadvantaged positions in large language models, effectively mitigating positional bias and improving performance on long‑text retrieval and in‑context reasoning tasks.

Knowledge Distillation · Large Language Models · In-context Reasoning
10 min read