Open liutaocode opened 9 months ago
I also have the same question about this.
I am confused: there is a temporal attention part inside the attention blocks, and there is also a motion module that is implemented separately and applied after the attention blocks. My guess is that the motion module is the one described in the paper (since it is self-attention), while the temporal attention inside the attention blocks, which looks like cross-attention, is not actually used.
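For anyone else trying to untangle the two, here is a minimal PyTorch sketch of the pattern a motion module follows: temporal self-attention that folds the spatial axis into the batch so attention only mixes frames. `TemporalSelfAttention` and the tensor layout here are my own illustration of the idea, not the repo's actual code.

```python
import torch
import torch.nn as nn
from einops import rearrange

class TemporalSelfAttention(nn.Module):
    """Sketch of a motion-module-style block: self-attention over frames."""
    def __init__(self, dim: int, heads: int = 8):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, frames, height*width, channels)
        b, f, hw, c = x.shape
        # Fold the spatial axis into the batch so attention mixes frames only.
        x = rearrange(x, "b f hw c -> (b hw) f c")
        h = self.norm(x)
        out, _ = self.attn(h, h, h, need_weights=False)
        x = x + out  # residual connection around the temporal attention
        return rearrange(x, "(b hw) f c -> b f hw c", b=b)

# Example: 2 videos, 16 frames, 8x8 latents, 320 channels.
block = TemporalSelfAttention(dim=320)
y = block(torch.randn(2, 16, 64, 320))  # same shape; frames now attend to each other
```

The key point is that it is plain self-attention along the frame axis, which matches what the paper describes for the motion module.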
Why is unet_use_temporal_attention always None or False? It looks like the temporal attention in the attention blocks is never active. Does anyone know about this? Thanks!
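If the flag really does stay None/False, the temporal branch inside the attention blocks is simply never built or run, which would be consistent with the motion module doing all the temporal mixing. A hedged sketch of that gating pattern (the class and flag wiring here are hypothetical, reusing the `TemporalSelfAttention` sketch from the comment above):

```python
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Hypothetical block: a flag left as None/False means the temporal
    branch is never instantiated and never executed."""
    def __init__(self, dim: int, unet_use_temporal_attention=None):
        super().__init__()
        self.use_temporal = bool(unet_use_temporal_attention)
        self.spatial_attn = nn.Identity()  # stands in for spatial self/cross attention
        self.temporal_attn = TemporalSelfAttention(dim) if self.use_temporal else None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.spatial_attn(x)
        if self.use_temporal:  # skipped whenever the flag is None or False
            x = self.temporal_attn(x)
        return x
```

So the question reduces to whether any config ever sets the flag to True; if not, the code path is effectively dead and the motion module is the only temporal component in play.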