General • 125 methods
Attention is a technique for weighting different parts of an input sequence when computing each output, which lets a model capture long-term dependencies. In the context of NLP, traditional sequence-to-sequence models compressed the entire input sequence into a single fixed-length context vector, which hindered their ability to handle long inputs such as lengthy sentences. In contrast, attention creates shortcuts between each decoding step and the entire source input, so no information has to squeeze through one fixed-size bottleneck. Below you will find a continuously updating list of attention-based building blocks used in deep learning.
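The mechanism described above can be sketched in a few lines. This is a minimal NumPy implementation of scaled dot-product attention (single head, no masking); the function and variable names are illustrative, not taken from any particular library. The key point is that every query position attends over the *entire* source, avoiding the fixed-length bottleneck:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """q: (n_q, d), k: (n_k, d), v: (n_k, d_v)."""
    # Similarity of each query to every key, scaled by sqrt(d)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # Softmax over the key axis (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted sum over ALL source values:
    # this is the "shortcut" to the entire source input.
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 8))   # 2 query positions
k = rng.standard_normal((4, 8))   # 4 source positions
v = rng.standard_normal((4, 16))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 16): one d_v-dim output per query
```

Each row of `w` is a probability distribution over the 4 source positions, so every output mixes information from the whole input rather than from one compressed vector.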
Year | Papers
---|---
2017 | 17290
2017 | 17245
2019 | 1324
2019 | 1323
2017 | 258
2015 | 218
2017 | 214
2014 | 195
2018 | 176
2021 | 162
2017 | 160
2018 | 135
2022 | 112
2014 | 104
2020 | 84
2021 | 81
2020 | 78
2020 | 77
2020 | 71
2020 | 68
2019 | 66
2020 | 65
2018 | 65
2019 | 52
2019 | 47
2021 | 44
2020 | 43
2020 | 37
2018 | 37
2017 | 36
2019 | 36
2015 | 33
2014 | 32
2021 | 32
2020 | 32
2015 | 31
2015 | 24
2018 | 24
2020 | 24
2019 | 22
2022 | 22
2021 | 22
2021 | 21
2019 | 19
2015 | 19
2020 | 19
2018 | 19
2020 | 17
2020 | 17
2018 | 15
2020 | 14
2022 | 13
2020 | 11
2021 | 11
2019 | 11
2017 | 9
2018 | 9
2019 | 9
2023 | 8
2020 | 7
2019 | 7
2021 | 7
2019 | 7
2019 | 7
2020 | 7
2017 | 7
2015 | 6
2019 | 6
2019 | 6
2018 | 6
2019 | 5
2021 | 5
2021 | 4
2018 | 4
2020 | 4
2020 | 4
2021 | 3
2018 | 3
2016 | 3
2020 | 3
2020 | 3
2021 | 3
2021 | 3
2020 | 3
2015 | 3
2018 | 3
2019 | 2
2020 | 2
2018 | 2
 | 2
2021 | 2
2020 | 2
2021 | 2
2017 | 2
2021 | 2
2017 | 2
2016 | 2
2021 | 2
2019 | 2
2020 | 2
2021 | 2
2021 | 2
2020 | 1
2020 | 1
2018 | 1
2018 | 1
2020 | 1
2022 | 1
2016 | 1
2022 | 1
2020 | 1
2021 | 1
2021 | 1
2023 | 1
2020 | 1
2021 | 1
2022 | 1
2020 | 1
2019 | 1
2021 | 1
2020 | 1
2019 | 1
2020 | 1
2021 | 1
2020 | 1
2020 | 1
2000 | 0