Transformer

Introduced by Vaswani et al. in Attention Is All You Need

A Transformer is a model architecture that eschews recurrence and instead relies entirely on an attention mechanism to draw global dependencies between input and output. Before Transformers, the dominant sequence transduction models were based on complex recurrent or convolutional neural networks that include an encoder and a decoder. The Transformer also employs an encoder and decoder, but removing recurrence in favor of attention mechanisms allows for significantly more parallelization than methods like RNNs and CNNs.

Source: Attention Is All You Need
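The attention mechanism described above can be illustrated with a minimal sketch of scaled dot-product attention, the core operation of the Transformer. This is a simplified single-head version with no masking or learned projections; the function name and toy shapes are illustrative, not from the paper's reference code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D arrays of shape (seq_len, d_k)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # pairwise similarity scores, (seq_len, seq_len)
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax: each row sums to 1
    return weights @ V                             # each output is a weighted sum of value vectors

# toy self-attention example: 3 tokens, model dimension 4
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(X, X, X)
```

Because every output position attends to every input position in one matrix multiply, the whole sequence is processed in parallel, which is the parallelization advantage over step-by-step recurrence.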


Tasks

Task                   Papers  Share
Language Modelling         50  6.98%
Semantic Segmentation      27  3.77%
Large Language Model       23  3.21%
Question Answering         20  2.79%
Object Detection           19  2.65%
Image Classification       16  2.23%
In-Context Learning        14  1.96%
Sentence                   12  1.68%
Image Segmentation         11  1.54%
