First Survey of Attention Sink: From Utilization and Understanding to Elimination in Transformers

This survey reviews over 180 papers on the Attention Sink phenomenon in Transformers, outlining its three-stage evolution—from early exploitation to mechanistic interpretation and finally strategic mitigation—while detailing utilization tactics, theoretical explanations, removal techniques, and promising future research directions.


Definition

Almost all Transformer models allocate a disproportionately large share of attention to a few specific tokens (e.g., the first token, [SEP], or background image patches). This inherent behaviour is termed the attention sink, and it is a systematic property of trained models rather than a bug.
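
To make this concrete, here is a minimal PyTorch sketch (ours, not from the survey) that scores each key position by the average attention it receives; a position scoring far above the uniform baseline 1/seq behaves as a sink. The logits are random stand-ins with an exaggerated bias toward token 0.

```python
import torch

def sink_scores(attn: torch.Tensor) -> torch.Tensor:
    """attn: [heads, seq, seq] post-softmax attention weights.
    Returns the attention each key position receives, averaged
    over all heads and queries."""
    return attn.mean(dim=(0, 1))  # -> [seq]

# Toy demo: bias the logits so every query scores the first key
# highly, mimicking the pattern observed in trained Transformers.
heads, seq = 8, 16
logits = torch.randn(heads, seq, seq)
logits[..., 0] += 4.0                 # exaggerated sink logit
attn = torch.softmax(logits, dim=-1)

scores = sink_scores(attn)
print(f"uniform baseline: {1 / seq:.3f}")
print(f"score of token 0: {scores[0]:.3f}")  # far above baseline
```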

Research Evolution

Early stage (2023 onwards): Researchers treated the sink as a usable feature or a factor to accommodate.

Middle stage (2024 onwards): Once empirical usage had stabilised, the focus shifted to uncovering the underlying mechanisms and improving interpretability.

Recent stage (2025 onwards): Armed with mechanistic insights, work has moved toward systematically removing the sink at the architectural level.

Utilization Strategies

Sink Token Preservation: Keep the sink token as a permanent attention anchor, stabilising the attention distribution during model compression.
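
A minimal sketch of this strategy in the spirit of StreamingLLM-style KV-cache eviction (the sizes n_sink and window are illustrative): old entries are dropped, but the first few sink positions are never evicted.

```python
import torch

def evict_keep_sinks(keys, values, n_sink=4, window=1024):
    """Shrink a KV cache but never evict the first n_sink positions
    (the attention anchors) or the most recent `window` positions.
    keys, values: [seq, dim]."""
    seq = keys.shape[0]
    if seq <= n_sink + window:
        return keys, values
    keep = torch.cat([
        torch.arange(n_sink),             # permanent sink tokens
        torch.arange(seq - window, seq),  # sliding recency window
    ])
    return keys[keep], values[keep]

# Usage: a 5000-entry cache shrinks to 4 sinks + 1024 recent entries.
k, v = torch.randn(5000, 64), torch.randn(5000, 64)
k2, v2 = evict_keep_sinks(k, v)
print(k2.shape)  # torch.Size([1028, 64])
```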

Attention Redistribution: Actively identify sink tokens and reallocate their attention weight to semantically meaningful tokens.
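
One simple realisation of this idea (a sketch; published methods differ in how sink positions are identified): zero the identified sink's attention column and renormalise each query row, so the freed mass flows to the remaining tokens.

```python
import torch

def redistribute(attn: torch.Tensor, sink_idx: int = 0,
                 eps: float = 1e-8) -> torch.Tensor:
    """attn: [..., seq, seq] post-softmax weights. Zero the mass on
    the sink key and renormalise each row over the remaining keys."""
    out = attn.clone()
    out[..., sink_idx] = 0.0
    return out / (out.sum(dim=-1, keepdim=True) + eps)

attn = torch.softmax(torch.randn(8, 16, 16), dim=-1)
fixed = redistribute(attn)
print(fixed[..., 0].max())   # no mass left on the sink position
print(fixed.sum(-1)[0, 0])   # rows still sum to ~1
```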

Learnable Prefix Tokens: Insert trainable prefix tokens before the input sequence, providing an explicit, controllable alternative to naturally occurring sinks.
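
A sketch of the prefix mechanism, assuming a standard PyTorch embedding pipeline; the module name and prefix count are illustrative, not the survey's notation.

```python
import torch
import torch.nn as nn

class LearnablePrefix(nn.Module):
    """Prepends n_prefix trainable vectors to every input sequence,
    giving attention an explicit, controllable anchor."""
    def __init__(self, n_prefix: int, d_model: int):
        super().__init__()
        self.prefix = nn.Parameter(torch.randn(n_prefix, d_model) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch, seq, d_model] -> [batch, n_prefix + seq, d_model]
        p = self.prefix.unsqueeze(0).expand(x.shape[0], -1, -1)
        return torch.cat([p, x], dim=1)

x = torch.randn(2, 10, 64)
print(LearnablePrefix(4, 64)(x).shape)  # torch.Size([2, 14, 64])
```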

Sink Token Repurposing: Exploit the sink’s stable, high‑attention property for specialised tasks such as adversarial injection or defensive detection.

Mechanistic Explanations

Softmax Limitations & No‑Op Theory: Because softmax must distribute exactly one unit of probability mass per query, a query that is unrelated to every key still has to attend somewhere; attention therefore collapses onto semantically irrelevant tokens, effectively implementing a no‑op.
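
The constraint is easy to see numerically. In the toy demo below, a query that matches no key still hands out a full unit of probability, and the least-implausible position absorbs nearly all of it.

```python
import torch

# A query that matches nothing: every logit is strongly negative.
logits = torch.tensor([-0.5, -9.0, -9.5, -10.0, -9.2])

weights = torch.softmax(logits, dim=-1)
print(weights)        # ~[0.999, ...]: token 0 soaks up the mass
print(weights.sum())  # exactly 1 -- softmax cannot express "none"
```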

Outlier Circuits: Systematic outlier values within the model interact to generate sink behaviour.

Implicit Attention Bias: Sink tokens contribute an almost constant amount to every query, acting as a fixed bias term.

Geometric Anchoring: In high‑dimensional representation space, sink tokens serve as stable reference points that anchor and stabilise the space.

Mitigation Techniques

Based on the above mechanisms, researchers propose two broad categories of removal methods:

Explicit Substitutes: Introduce alternatives that make the sink unnecessary, e.g., Gated Attention (learnable gates that close when a no‑op is needed) and Learnable Attention Bias (explicit bias parameters replacing the implicit sink).
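
Here is a minimal sketch of the learnable-bias idea (illustrative, not the survey's exact formulation): each head owns a trainable logit that joins the softmax as a phantom key, giving queries somewhere to park mass other than a real token.

```python
import torch
import torch.nn as nn

class BiasedAttention(nn.Module):
    """Scaled dot-product attention with one learnable 'phantom'
    logit per head that absorbs would-be sink mass."""
    def __init__(self, n_heads: int):
        super().__init__()
        self.bias_logit = nn.Parameter(torch.zeros(n_heads, 1, 1))

    def forward(self, q, k, v):
        # q, k, v: [heads, seq, dim]
        scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5
        b = self.bias_logit.expand(-1, scores.shape[1], 1)
        full = torch.cat([scores, b], dim=-1)      # append phantom key
        w = torch.softmax(full, dim=-1)[..., :-1]  # discard its weight
        return w @ v  # real weights now sum to <= 1 per query

q = k = v = torch.randn(8, 16, 64)
print(BiasedAttention(8)(q, k, v).shape)  # torch.Size([8, 16, 64])
```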

Causal Disruption: Alter the root cause, including Modified Softmax Functions that relax the sum‑to‑one constraint and Pre‑training Interventions that suppress sink formation during training without architectural changes.
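
One concrete example of a relaxed softmax is the "softmax-plus-one" (quiet softmax) variant, which adds 1 to the denominator so that attention weights may sum to less than one; the sketch below implements it with the usual max-shift for numerical stability.

```python
import torch

def softmax_plus_one(logits: torch.Tensor, dim: int = -1) -> torch.Tensor:
    """softmax_1(x)_i = exp(x_i) / (1 + sum_j exp(x_j)).
    Weights may sum to < 1, so a query can attend to 'nothing'
    instead of dumping its surplus mass onto a sink token."""
    m = logits.max(dim=dim, keepdim=True).values
    e = torch.exp(logits - m)
    # the implicit extra logit is 0; after the max-shift its term
    # becomes exp(-m), keeping the computation numerically stable
    return e / (torch.exp(-m) + e.sum(dim=dim, keepdim=True))

# An all-negative row now yields near-zero total attention:
w = softmax_plus_one(torch.tensor([-9.0, -9.5, -10.0]))
print(w, w.sum())  # tiny weights, summing to well under 1
```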

The survey also covers Outlier‑Driven Rescaling and Architectural Isolation as further mitigation techniques.

Future Directions

Efficient Lightweight Processing: Design low‑latency attention‑redistribution methods and softmax variants that remain compatible with fast attention kernels, so that sink handling does not become an inference bottleneck.

Lightweight Adaptation for Pre‑trained Models: Inject sink‑suppression capability into existing models via parameter‑efficient transfer, avoiding costly retraining.

Emerging Architectures Exploration: Study sink behaviour in hybrid linear‑attention models, 3D Transformers, and other novel designs.

Other avenues include training‑dynamics studies, unified theoretical frameworks, standardized benchmarks, cross‑architecture transfer, and multi‑technique integration.

Resources

Paper: https://arxiv.org/abs/2604.10098

GitHub repository: https://github.com/ZunhaiSu/Awesome-Attention-Sink

Tags: Transformers, Mitigation, Model Interpretability, Survey, Attention Sink
Written by Machine Heart, a professional AI media and industry service platform.