Attention mechanisms are components of neural networks that model long-range interactions, for example across a long text in NLP. The key idea is to build shortcuts between a context vector and the input sequence, allowing the model to attend to different parts of the input. Below is a continuously updated list of attention mechanisms.
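To illustrate the shortcut idea, here is a minimal NumPy sketch of scaled dot-product attention, the variant popularized by the Transformer. The function name and toy dimensions are my own; a real implementation would add batching, masking, and multiple heads.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to every key; output is a weighted sum of values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)    # stabilize softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # convex combination of values

# Toy self-attention: 3 positions, 4-dimensional embeddings, Q = K = V
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Because the softmax weights are a convex combination, each output vector lies inside the span of the value vectors; this is what lets gradients flow directly between distant positions.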
Subcategories
[Table: attention mechanisms with publication year and paper count, sorted by number of papers (years 2014–2022; counts from 17290 down to 1). The method-name column did not survive extraction.]