Closed jasonli0707 closed 8 months ago

Environments:

It seems that the current version does not support FlashAttention-2. I encountered the following errors when running `minimal.py` with `attn_implementation="flash_attention_2"`.

The current released version does not support FlashAttention, but I have already made it compatible with FlashAttention-2 (and many other augmentations). It will be released in a few days.

I see, thanks! Looking forward to it!
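Until a FlashAttention-2-compatible release is available, one defensive workaround (a sketch, not from this repo — it assumes the standard Hugging Face `attn_implementation` argument and the `flash-attn` package name) is to check whether `flash-attn` is installed before requesting it, and fall back to the default attention implementation otherwise:

```python
import importlib.util


def pick_attn_implementation() -> str:
    """Return "flash_attention_2" only when the flash-attn package is importable.

    Otherwise fall back to "sdpa" (PyTorch scaled-dot-product attention),
    which transformers accepts as a default implementation.
    """
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "sdpa"


# Hypothetical usage with a transformers-style loader:
#   model = AutoModelForCausalLM.from_pretrained(
#       "model-name",  # placeholder, not a real checkpoint
#       attn_implementation=pick_attn_implementation(),
#   )
```

This avoids the hard error on machines without `flash-attn`, at the cost of silently running the slower attention path there.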