lucidrains / x-transformers

A concise but complete full-attention transformer with a set of promising experimental features from various papers
MIT License

Feature Request: Hyena Attention #140

Closed vvvm23 closed 1 year ago

vvvm23 commented 1 year ago

I saw you recently added Flash Attention to the repository. Would you consider adding a relative of it, Hyena Hierarchy, as well? It comes from the same research group as FlashAttention, and it has a nice standalone implementation here.
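For anyone unfamiliar: Hyena replaces attention with implicitly parameterized long convolutions interleaved with elementwise gating. Below is a minimal NumPy sketch of that core recurrence, just to convey the idea; the function names, the use of explicit (rather than implicit/MLP-parameterized) filters, and the order-2 structure are all illustrative assumptions, not the linked implementation.

```python
import numpy as np

def fft_long_conv(u, k):
    # Causal long convolution over the full sequence length, computed in
    # O(L log L) via FFT with zero-padding (to avoid circular wrap-around).
    L = u.shape[-1]
    n = 2 * L
    y = np.fft.irfft(np.fft.rfft(u, n) * np.fft.rfft(k, n), n)
    return y[..., :L]

def hyena_sketch(x, filters, gates):
    # Hyena-style recurrence (sketch): alternate a long convolution with an
    # elementwise gate. In the real model, filters come from a small implicit
    # network and gates are learned projections of the input; here both are
    # passed in directly for illustration.
    v = x
    for k, g in zip(filters, gates):
        v = g * fft_long_conv(v, k)
    return v
```

The FFT-based convolution is what lets Hyena scale sub-quadratically with sequence length, in contrast to the quadratic cost of full attention.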