mindspore-lab / mindone

one for all, Optimal generator with No Exception
https://mindspore-lab.github.io/mindone/
Apache License 2.0

FlashAttention: fix compatibility #291

Closed: wtomin closed this pull request 5 months ago

wtomin commented 6 months ago

After these modifications, I ran text_to_image.py on both 910A (MindSpore 2.1) and 910B (MindSpore 2.2) with SDv1.5 and SDv2.0. All experiments ran without errors, and the generated images are of good quality.

However, the experiments only exercised CrossAttention. I haven't tested CrossFrameAttention, because I'm not sure when it is applied.

PS: I just found that SDv1.5 does not use FlashAttention by default, so the SDv1.5 experiments were not meaningful.

SamitHuang commented 6 months ago

Good. The generated images by sd1.5+fa on 910b are also ok?

wtomin commented 6 months ago

> Good. The generated images by sd1.5+fa on 910b are also ok?

SDv1.5 does not use FlashAttention by default, so my experiments with SDv1.5 used only vanilla attention.
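For context, the "vanilla attention" path that the SDv1.5 runs fell back to is standard scaled dot-product attention, softmax(QK^T / sqrt(d)) V. A minimal pure-Python sketch of that computation (hypothetical helper names, not mindone's actual implementation):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def vanilla_attention(Q, K, V):
    """Naive scaled dot-product attention: softmax(QK^T / sqrt(d)) V.

    Q, K, V are lists of row vectors (lists of floats). This materializes
    the full attention-weight matrix, which is exactly the memory cost
    FlashAttention avoids by computing the softmax in tiles.
    """
    d = len(Q[0])
    out = []
    for q in Q:
        # One row of QK^T, scaled by 1/sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        # Weighted sum of value rows.
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out
```

With all keys equal, the weights are uniform and the output is just the mean of the value rows, which gives a quick sanity check.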

wtomin commented 5 months ago

I found a better solution to this problem, so I'm closing this PR to avoid confusion.