Document detail
IDENTIFICATION

oai:arXiv.org:2406.00427

Topic
Computer Science - Computer Vision...
Author
Zhang, Shuoxi; Liu, Hanpeng; Lin, Stephen; He, Kun
Category

Computer Science

Year

2024

Date indexed

5/6/2024

Keywords
computer vision, ViTs, attention
Metrics

Abstract

The advent of Vision Transformers (ViTs) marks a substantial paradigm shift in the realm of computer vision.

ViTs capture the global information of images through self-attention modules, which perform dot product computations among patchified image tokens.

While self-attention modules empower ViTs to capture long-range dependencies, the computational complexity grows quadratically with the number of tokens, which is a major hindrance to the practical application of ViTs.
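For illustration, the minimal sketch below (plain PyTorch, assumed here; none of the names come from the paper) shows where the quadratic term arises: the N x N score matrix built from dot products over every pair of tokens.

```python
# Minimal sketch (assumed PyTorch) of standard scaled dot-product self-attention.
# The N x N score matrix Q @ K^T is what makes the cost quadratic in the number
# of tokens N.
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    # x: (batch, N, d) patchified image tokens
    q, k, v = x @ w_q, x @ w_k, x @ w_v                       # three (batch, N, d) projections
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)   # (batch, N, N): O(N^2 * d)
    attn = F.softmax(scores, dim=-1)                          # attention map over all token pairs
    return attn @ v, attn                                     # weighted values, plus the map

# Example: 196 tokens (14 x 14 patches) of dimension 64
d = 64
x = torch.randn(1, 196, d)
w_q, w_k, w_v = (torch.randn(d, d) for _ in range(3))
out, attn = self_attention(x, w_q, w_k, w_v)
print(out.shape, attn.shape)  # torch.Size([1, 196, 64]) torch.Size([1, 196, 196])
```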

Moreover, the self-attention mechanism in deep ViTs is also susceptible to the attention saturation issue.

Accordingly, we argue against the necessity of computing the attention scores in every layer, and we propose the Less-Attention Vision Transformer (LaViT), which computes only a few attention operations at each stage and calculates the subsequent feature alignments in other layers via attention transformations that leverage the previously calculated attention scores.

This novel approach can mitigate two primary issues plaguing traditional self-attention modules: the heavy computational burden and attention saturation.

Our proposed architecture offers superior efficiency and ease of implementation, merely requiring matrix multiplications that are highly optimized in contemporary deep learning frameworks.
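As an illustration of the reuse idea described above, the hedged sketch below has a later layer skip the query/key projections and instead apply a small learned transformation to an attention map stored by an earlier layer, so only matrix multiplications remain. The module and names (LessAttentionLayer, reuse_proj) are hypothetical and assume PyTorch; this is not the authors' LaViT implementation.

```python
# Hedged sketch: re-use a previously computed attention map instead of
# recomputing Q K^T in every layer. Hypothetical names, assumed PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LessAttentionLayer(nn.Module):
    def __init__(self, dim, num_tokens):
        super().__init__()
        self.v_proj = nn.Linear(dim, dim)                     # value projection (still needed)
        self.reuse_proj = nn.Linear(num_tokens, num_tokens)   # transforms the stored N x N map

    def forward(self, x, prev_attn):
        # x: (batch, N, dim) tokens; prev_attn: (batch, N, N) map from an earlier layer.
        attn = F.softmax(self.reuse_proj(prev_attn), dim=-1)  # re-normalized transformed map
        return attn @ self.v_proj(x), attn                    # only matrix multiplications, no Q/K

# Usage: feed the map produced by a full attention layer into later layers.
layer = LessAttentionLayer(dim=64, num_tokens=196)
x = torch.randn(1, 196, 64)
prev_attn = torch.softmax(torch.randn(1, 196, 196), dim=-1)
out, new_attn = layer(x, prev_attn)
print(out.shape)  # torch.Size([1, 196, 64])
```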

Moreover, our architecture demonstrates exceptional performance across various vision tasks including classification, detection and segmentation.

Comment: CVPR 2024 Camera-Ready; 10 pages, 3 figures

Zhang, Shuoxi; Liu, Hanpeng; Lin, Stephen; He, Kun, 2024, You Only Need Less Attention at Each Stage in Vision Transformers

