Dao-AILab / flash-attention

Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License

Relative positions #946

Open Sunnikickback opened 4 months ago

Sunnikickback commented 4 months ago

Hello, I want to use FlashAttention for WavLM, which relies on relative position bias. I saw an issue where somebody said this is not supported yet. My question is the same: is it supported now, or is there still no way to do it?

tridao commented 4 months ago

It is not supported yet.
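
For context, a minimal sketch of a common fallback, not from this thread: since the flash_attn kernels do not accept an arbitrary additive attention bias, a layer that needs WavLM-style relative position bias can instead call PyTorch's `torch.nn.functional.scaled_dot_product_attention` (PyTorch >= 2.0), which takes a float `attn_mask` that is added to the attention logits. The `rel_bias` tensor here is a hypothetical stand-in for the per-head bias WavLM computes; note that passing a mask may dispatch SDPA to a slower, non-flash kernel.

```python
import torch
import torch.nn.functional as F

batch, heads, seqlen, head_dim = 2, 8, 128, 64
q = torch.randn(batch, heads, seqlen, head_dim,
                device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Hypothetical additive relative-position bias: one (seqlen, seqlen)
# matrix per head, standing in for WavLM's gated relative position bias.
rel_bias = torch.randn(heads, seqlen, seqlen,
                       device="cuda", dtype=torch.float16)

# attn_mask must broadcast to (batch, heads, seqlen, seqlen);
# a float mask is added to q @ k^T before the softmax.
out = F.scaled_dot_product_attention(q, k, v,
                                     attn_mask=rel_bias.unsqueeze(0))
```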