Cross-Attention and Encoder-Decoder Transformers: A Logical Characterization
Abstract
arXiv:2605.07705v1 Announce Type: cross

Abstract: We give a novel logical characterization of encoder-decoder transformers, the foundational architecture for LLMs, which also sees use in various settings that benefit from cross-attention. We study such transformers over text in the practical setting …
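Since the abstract centers on cross-attention in encoder-decoder transformers, a minimal sketch may help fix ideas: in cross-attention, the queries come from the decoder while the keys and values come from the encoder, so each decoder position forms a weighted summary of the source sequence. The sketch below is a simplified single-head illustration in numpy, not the paper's formalization; the shapes and weight matrices are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, encoder_states, Wq, Wk, Wv):
    # Cross-attention: queries from the decoder, keys/values from the encoder.
    Q = decoder_states @ Wq
    K = encoder_states @ Wk
    V = encoder_states @ Wv
    # Scaled dot-product attention over the source positions.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V

# Illustrative sizes (assumptions): 5 source positions, 3 target positions.
rng = np.random.default_rng(0)
d = 8
enc = rng.normal(size=(5, d))
dec = rng.normal(size=(3, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = cross_attention(dec, enc, Wq, Wk, Wv)
print(out.shape)  # one mixed source summary per decoder position: (3, 8)
```

Each of the 3 decoder positions attends over all 5 encoder positions, which is the mechanism the abstract refers to as benefiting various sequence-to-sequence settings.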