Separius / awesome-fast-attention

list of efficient attention modules
GNU General Public License v3.0
990 stars 108 forks

TaLK Convolutions #3

Closed lioutasb closed 3 years ago

lioutasb commented 3 years ago

I'm the author of "Time-aware Large Kernel Convolutions" (https://arxiv.org/abs/2002.03184), an alternative to self-attention with linear complexity, published at ICML 2020. You can find the implementation here: https://github.com/lioutasb/TaLKConvolutions. Thanks a lot.
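The core idea, as I understand it, is to learn per time step how far an averaging window extends to the left and right, and to evaluate each window in O(1) using a summed-area table (prefix sums), giving O(n) overall. Below is a minimal NumPy sketch of that idea only; the function name, integer window offsets, and hard window boundaries are simplifications (the paper predicts relative offsets and uses soft interpolation), so see the linked repo for the real implementation.

```python
import numpy as np

def talk_style_conv(x, left, right):
    """Sketch of a time-aware adaptive-window averaging convolution.

    x:     (seq_len, dim) input sequence
    left:  (seq_len,) per-step left window sizes (non-negative ints)
    right: (seq_len,) per-step right window sizes (non-negative ints)

    For each position t, output the average of x over the adaptive
    window [t - left[t], t + right[t]], clipped to the sequence. A
    summed-area table makes every window sum O(1), so the whole pass
    is linear in seq_len.
    """
    seq_len, dim = x.shape
    # Prefix sums with a leading zero row: S[i] holds sum of x[:i].
    S = np.vstack([np.zeros((1, dim)), np.cumsum(x, axis=0)])
    out = np.empty_like(x)
    for t in range(seq_len):
        lo = max(0, t - int(left[t]))
        hi = min(seq_len - 1, t + int(right[t]))
        # Window sum in O(1) via the summed-area table, then average.
        out[t] = (S[hi + 1] - S[lo]) / (hi + 1 - lo)
    return out
```

For example, with unit left/right offsets each output is a centered three-point moving average (truncated at the boundaries), yet changing the offsets per step changes the kernel size without any extra cost.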

Separius commented 3 years ago

Thanks, I will add it as soon as possible (within a week).