FlashAttention in MindSpore only accepts a (0, 1)-valued mask, where 0 means keep and 1 means discard.

Editing the current Opensora-PKU attention mask (see the sketch after this list):

- Remove the `-ms.numpy.inf` conversion in `LatteT2V.construct`; pass `1 - mask` instead, since FA treats 1 as discard and 0 as retain.
- Apply `-ms.numpy.inf` in vanilla Attention, and leave the mask untouched for FA.

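To make the two conventions concrete, here is a minimal sketch (not the PR's exact code) of converting a standard (1 = keep, 0 = discard) mask for each backend; the helper name `prepare_attention_mask` is hypothetical:

```python
import mindspore as ms
from mindspore import ops

def prepare_attention_mask(mask: ms.Tensor, use_flash_attention: bool) -> ms.Tensor:
    """Convert a (1 = keep, 0 = discard) mask to the form each backend expects."""
    if use_flash_attention:
        # MindSpore FA wants 0 = keep, 1 = discard, so flip the convention.
        return 1 - mask
    # Vanilla attention adds the mask to the QK^T logits before the softmax,
    # so discarded positions must become -inf while kept positions stay 0.
    zeros = ops.zeros_like(mask).astype(ms.float32)
    return ops.masked_fill(zeros, mask == 0, ms.Tensor(-ms.numpy.inf, ms.float32))
```
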
Minor changes:

- Use FA in `sample_t2v.py`;
- Use `npz` (for the saved text embeddings) in `sample_text_embed.py`.
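For illustration, saving embeddings to `npz` might look like the sketch below; the file name, array shapes, and keys are assumptions, not the PR's actual code:

```python
import numpy as np

# Hypothetical example: persist a text embedding and its mask as .npz so that
# sample_t2v.py can reuse them without re-running the text encoder.
text_emb = np.random.randn(1, 120, 4096).astype(np.float32)  # (batch, tokens, dim) is illustrative
text_mask = np.ones((1, 120), dtype=np.int32)
np.savez("t2v_text_embed.npz", text_emb=text_emb, mask=text_mask)

# Later, e.g. in sample_t2v.py:
data = np.load("t2v_text_embed.npz")
text_emb, text_mask = data["text_emb"], data["mask"]
```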