Closed · agneet42 closed this 1 month ago
Hi, thanks for the great work!
As mentioned in the paper, you employ full 3D attention instead of factorized spatial and temporal attention. I was wondering whether this is the line of code that implements it: https://github.com/jy0205/Pyramid-Flow/blob/main/pyramid_dit/modeling_mmdit_block.py#L396. If not, could you point me to the relevant code?
Yes, this class is full-sequence attention.
Thanks!
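For anyone else landing on this issue: below is a minimal sketch of the difference between full-sequence (3D) attention and factorized spatial/temporal attention. This is not the repo's actual MMDiT block; the function names and shapes are illustrative assumptions, and QKV projections plus multi-head splitting are omitted for brevity.

```python
# Illustrative sketch only, not the Pyramid-Flow implementation.
# Shapes, names, and the single-head/no-projection setup are assumptions.
import torch
import torch.nn.functional as F


def full_3d_attention(x: torch.Tensor) -> torch.Tensor:
    # x: (B, T, H, W, C). Flatten all T*H*W tokens into one sequence so
    # every token attends to every other token across space and time.
    B, T, H, W, C = x.shape
    tokens = x.reshape(B, T * H * W, C)
    out = F.scaled_dot_product_attention(tokens, tokens, tokens)
    return out.reshape(B, T, H, W, C)


def factorized_attention(x: torch.Tensor) -> torch.Tensor:
    # Spatial attention within each frame, then temporal attention per location.
    B, T, H, W, C = x.shape
    spatial = x.reshape(B * T, H * W, C)
    spatial = F.scaled_dot_product_attention(spatial, spatial, spatial)
    x = spatial.reshape(B, T, H, W, C)
    temporal = x.permute(0, 2, 3, 1, 4).reshape(B * H * W, T, C)
    temporal = F.scaled_dot_product_attention(temporal, temporal, temporal)
    return temporal.reshape(B, H, W, T, C).permute(0, 3, 1, 2, 4)
```

The full-sequence variant attends over all T·H·W tokens at once, so its cost grows quadratically with the spatio-temporal sequence length, whereas the factorized variant splits this into two cheaper passes over H·W and T tokens respectively.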