General • 73 methods
Activation functions are non-linear functions applied in neural networks, typically after an affine transformation that combines weights and input features; without this non-linearity, a stack of layers would collapse into a single linear map. The rectified linear unit (ReLU) has been the most popular choice over the past decade, although the best choice is architecture-dependent and many alternatives have emerged in recent years. This section collects a continually updated list of activation functions.
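As a quick illustration of the pattern described above, here is a minimal sketch of a single dense layer followed by ReLU. It assumes NumPy; the layer sizes, weights, and input values are invented for the example and are not taken from any method listed below.

```python
import numpy as np

def relu(z):
    """Rectified linear unit: max(0, z), applied elementwise."""
    return np.maximum(0.0, z)

# Toy dense layer: affine transformation followed by the activation.
# Shapes and values here are illustrative only.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))      # weights: 3 input features -> 4 units
b = np.zeros(4)                  # biases
x = np.array([0.5, -1.2, 2.0])   # one input example

z = W @ x + b   # affine transformation (pre-activation)
a = relu(z)     # non-linearity

print(z)  # pre-activations, may be negative
print(a)  # post-activations, negatives clamped to 0
```

Swapping `relu` for another elementwise function (e.g. `np.tanh`) is the only change needed to try a different activation in this sketch.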
| Method | Year | Papers |
| --- | --- | --- |
| | 2000 | 9398 |
| | 2016 | 8056 |
| | 2000 | 6231 |
| | 2000 | 5642 |
| | 2014 | 1175 |
| | 2016 | 622 |
| | 2017 | 325 |
| | 2000 | 236 |
| | 2017 | 225 |
| | 2019 | 208 |
| | 2015 | 99 |
| | 2017 | 77 |
| | 2019 | 73 |
| | 2020 | 54 |
| | 2023 | 49 |
| | 2013 | 48 |
| | 2015 | 36 |
| | 2000 | 33 |
| | 2017 | 19 |
| | 2017 | 14 |
| | 2018 | 12 |
| | 2000 | 12 |
| | 2021 | 10 |
| | 2016 | 9 |
| | 2020 | 7 |
| | 2017 | 7 |
| | 2020 | 6 |
| | 2020 | 5 |
| | 2015 | 5 |
| | 2000 | 5 |
| | 2020 | 4 |
| | 2015 | 4 |
| | 2020 | 3 |
| | 2015 | 3 |
| | 2021 | 3 |
| | 2019 | 3 |
| | 2023 | 2 |
| | 2015 | 2 |
| | 1994 | 2 |
| | 2019 | 2 |
| | 2019 | 2 |
| | 2022 | 2 |
| | 2019 | 2 |
| | 2021 | 2 |
| | 2020 | 2 |
| | 2016 | 2 |
| | 2018 | 2 |
| | 2019 | 1 |
| | 2023 | 1 |
| | 2018 | 1 |
| | 2023 | 1 |
| | 2021 | 1 |
| | 2020 | 1 |
| | 2000 | 1 |
| | 2023 | 1 |
| | 2021 | 1 |
| | 2022 | 1 |
| | 2023 | 1 |
| | 2023 | 1 |
| | 2000 | 1 |
| | 2021 | 1 |
| | 2000 | 1 |
| | 2018 | 1 |
| | 2020 | 1 |
| | 2020 | 1 |
| | 2018 | 1 |
| | 2023 | 1 |
| | 2022 | 1 |
| | 2022 | 1 |
| | 1998 | 0 |
| | 2020 | 0 |
| | 2000 | 0 |