arXiv cs.AI, via the Synapse Flow editorial team

S2O: Early Stopping for Sparse Attention via Online Permutation

Abstract

arXiv:2602.22575v2 (replace-cross). Abstract: Attention scales quadratically with sequence length, fundamentally limiting long-context inference. Existing block-granularity sparsification can reduce latency, but coarse blocks impose an intrinsic sparsity ceiling, making further improvem…
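To make the "sparsity ceiling" of block-granularity sparsification concrete, here is a minimal toy sketch (not the paper's S2O method, whose details are truncated above): attention scores are pruned in whole tiles, so the achievable sparsity is quantized in steps of `block × block` entries. The function name and parameters are illustrative assumptions, not from the paper.

```python
import numpy as np

def block_sparse_attention(q, k, v, block=4, keep_ratio=0.5):
    """Toy block-granularity sparse attention (illustration only).

    Dense scores are computed, then whole (block x block) tiles are kept
    or dropped by their mean score. Because pruning happens per tile,
    sparsity can only change in steps of block*block entries -- the
    intrinsic ceiling that coarse blocks impose.
    """
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)                 # (n, n) dense scores
    nb = n // block
    # Mean score of each (block x block) tile: shape (nb, nb).
    tiles = scores.reshape(nb, block, nb, block).mean(axis=(1, 3))
    k_keep = max(1, int(keep_ratio * nb * nb))
    thresh = np.sort(tiles.ravel())[::-1][k_keep - 1]
    tile_mask = tiles >= thresh                   # keep top-scoring tiles
    tile_mask |= np.eye(nb, dtype=bool)           # keep diagonal tiles so no row is empty
    # Expand the tile mask back to entry granularity.
    mask = np.repeat(np.repeat(tile_mask, block, axis=0), block, axis=1)
    scores = np.where(mask, scores, -np.inf)
    # Numerically stable softmax over the surviving entries.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

rng = np.random.default_rng(0)
q, k, v = rng.standard_normal((3, 16, 8))
out = block_sparse_attention(q, k, v)
print(out.shape)
```

Note that with `block=4` the mask can only drop multiples of 16 score entries at a time; element-level sparsification has no such quantization, which is why finer granularity is needed to push sparsity further.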

