guoyww / AnimateDiff

Official implementation of AnimateDiff.
https://animatediff.github.io
Apache License 2.0

Why is unet_use_temporal_attention always None or False? #280

Open liutaocode opened 9 months ago

liutaocode commented 9 months ago

Why is unet_use_temporal_attention always None or False? The temporal attention does not seem to be doing anything. Does anyone know about this? Thanks!
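
For anyone who wants to confirm this quickly, here is a minimal sketch for printing the flag as it is passed to the UNet at inference time. The config path and the `unet_additional_kwargs` key below are assumptions based on how the inference YAMLs in this repo appear to be laid out; adjust them to your checkout.

```python
# Minimal sketch for inspecting the flag value; the path and key names are assumptions.
from omegaconf import OmegaConf

config = OmegaConf.load("configs/inference/inference-v1.yaml")  # adjust to your config file
unet_kwargs = config.get("unet_additional_kwargs", {})          # assumed key name

# Expected to print None or False, matching the observation above.
print("unet_use_temporal_attention:", unet_kwargs.get("unet_use_temporal_attention"))
```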

although-not-but commented 8 months ago

I have the same question.

moegi161 commented 2 months ago

I am also confused: there is a temporal attention branch inside the attention blocks, but there is also a motion module that is implemented separately and applied after the attention blocks. My guess is that the motion module is the one described in the paper (it is a temporal self-attention), while the temporal attention inside the attention blocks, which looks like cross-attention, is simply never used.
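
For concreteness, here is a minimal sketch (my own illustration, not the repository's code) of what a temporal self-attention of the kind the motion module implements boils down to: the hidden states are regrouped so that tokens at the same spatial location attend to each other across frames.

```python
# Sketch of temporal self-attention along the frame axis; not the repo's exact module.
import torch
import torch.nn as nn
from einops import rearrange

class TemporalSelfAttentionSketch(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        # Plain multi-head self-attention; the real motion module also adds positional
        # encodings and a feed-forward block, omitted here for brevity.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, hidden_states: torch.Tensor, video_length: int) -> torch.Tensor:
        # hidden_states: (batch*frames, tokens, channels)
        n_tokens = hidden_states.shape[1]
        # Group by spatial token so each attention sequence runs across frames.
        x = rearrange(hidden_states, "(b f) n c -> (b n) f c", f=video_length)
        out, _ = self.attn(x, x, x)  # self-attention along the frame axis only
        return rearrange(out, "(b n) f c -> (b f) n c", n=n_tokens)

# Usage: 2 clips x 16 frames, 64 spatial tokens, 320 channels
x = torch.randn(2 * 16, 64, 320)
y = TemporalSelfAttentionSketch(dim=320)(x, video_length=16)
print(y.shape)  # torch.Size([32, 64, 320])
```

The key point is the `rearrange` step: attention runs over frames rather than over text tokens, which matches the paper's description of the motion module, whereas the `unet_use_temporal_attention` path in the attention blocks is a separate mechanism that stays disabled.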