The Structural Origin of Attention Sink: Variance Discrepancy, Super Neurons, and Dimension Disparity
Abstract
arXiv:2605.06611v1 Announce Type: cross

Abstract: Despite the prevalence of the attention sink phenomenon in Large Language Models (LLMs), where initial tokens disproportionately monopolize attention scores, its structural origins remain elusive. This work provides a \textit{mechanistic explanation…
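The attention sink phenomenon the abstract describes can be illustrated with a minimal sketch: under causal softmax attention, a systematic score bias toward the first token concentrates attention mass there. Everything below (sequence length, head dimension, and the additive bias standing in for whatever structural cause the paper identifies) is an illustrative assumption, not the paper's method.

```python
import numpy as np

# Illustrative sketch of an "attention sink" (assumed parameters, not
# the paper's mechanism): causal softmax attention over random Q/K,
# with an additive score bias toward token 0 standing in for the
# structural cause studied in the paper.
rng = np.random.default_rng(0)
T, d = 16, 64                              # sequence length, head dim (assumed)
Q = rng.standard_normal((T, d))
K = rng.standard_normal((T, d))

scores = Q @ K.T / np.sqrt(d)              # scaled dot-product scores
scores[:, 0] += 4.0                        # bias toward the first token (assumed)
scores[np.triu(np.ones((T, T), dtype=bool), k=1)] = -np.inf  # causal mask

# Numerically stable row-wise softmax.
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)

sink_mass = attn[:, 0].mean()              # average attention on token 0
print(f"mean attention mass on the first token: {sink_mass:.3f}")
```

With the bias in place, the first token captures the majority of attention on average, mimicking the "disproportionate monopolization" the abstract refers to; removing the `+= 4.0` line restores a roughly uniform-per-position spread.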