Data Party THU
Aug 27, 2025 · Artificial Intelligence
When Does Dot-Product Attention Switch from Positional to Semantic? A Phase Transition Theory
This paper presents a solvable low-rank dot-product attention model and, using high-dimensional asymptotics and a Generalized Approximate Message Passing (GAMP) analysis, derives closed-form characterizations of the global optima. These reveal a phase transition from a positional to a semantic attention mechanism as sample complexity grows, and the theory is validated empirically against linear baselines.
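To make the object of study concrete, here is a minimal NumPy sketch of a tied low-rank dot-product attention layer. The specific choices (tied query/key weights `Q = K = X @ W`, the shapes, and the 1/sqrt(d) scaling) are illustrative assumptions for this sketch, not the paper's exact model.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def low_rank_attention(X, W):
    """Tied low-rank dot-product attention (illustrative).

    X: (L, d) token embeddings; W: (d, r) low-rank projection with r << d.
    Queries and keys share the same projection, so the score matrix
    has rank at most r.
    """
    d = X.shape[1]
    P = X @ W                          # (L, r) low-rank projection
    scores = P @ P.T / np.sqrt(d)      # (L, L) dot-product scores
    A = softmax(scores, axis=-1)       # row-stochastic attention weights
    return A @ X                       # attention-weighted token mixture

rng = np.random.default_rng(0)
L, d, r = 8, 32, 2                     # sequence length, embedding dim, rank
X = rng.standard_normal((L, d))
W = rng.standard_normal((d, r)) / np.sqrt(d)
Y = low_rank_attention(X, W)
print(Y.shape)                         # (8, 32)
```

In this parameterization, whether the learned `W` aligns with positional structure or with semantic content of the tokens is exactly the kind of question the paper's phase-transition analysis addresses.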
GAMP · dot-product attention · high-dimensional limit
