Huage001 / LinFusion

Official PyTorch and Diffusers Implementation of "LinFusion: 1 GPU, 1 Minute, 16K Image"
Apache License 2.0

Confusion about the "batch_to_head_dim" function #9

Closed YecanLee closed 1 month ago

YecanLee commented 1 month ago

Hi, thank you for your amazing work! I read the whole paper and it really inspired me a lot!

I'm a little confused about line 76 of src/linfusion/attention.py: it calls a function named self.batch_to_head_dim. The same thing happens in src/distrifuser/modules/pp/attn.py at line 247. May I ask whether this is a typo for head_to_batch_dim, or whether there is a missing import at these two places?

YecanLee commented 1 month ago

Sorry, I found it in the source code of diffusers. Thank you again for your amazing work!
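For context: batch_to_head_dim and head_to_batch_dim are both defined as methods on the Attention class in diffusers, so no extra import is needed. Below is a minimal NumPy sketch of the reshapes these two helpers perform; the standalone function signatures here are simplified for illustration (in diffusers they are methods that read the head count from self.heads):

```python
import numpy as np

def head_to_batch_dim(tensor: np.ndarray, heads: int) -> np.ndarray:
    """Split heads out of the feature dim and fold them into the batch dim:
    (batch, seq, heads * head_dim) -> (batch * heads, seq, head_dim)."""
    batch, seq, dim = tensor.shape
    t = tensor.reshape(batch, seq, heads, dim // heads)
    return t.transpose(0, 2, 1, 3).reshape(batch * heads, seq, dim // heads)

def batch_to_head_dim(tensor: np.ndarray, heads: int) -> np.ndarray:
    """Inverse operation, used after attention to merge heads back:
    (batch * heads, seq, head_dim) -> (batch, seq, heads * head_dim)."""
    bh, seq, dim = tensor.shape
    t = tensor.reshape(bh // heads, heads, seq, dim)
    return t.transpose(0, 2, 1, 3).reshape(bh // heads, seq, dim * heads)

# Round trip: splitting then merging recovers the original tensor.
x = np.random.randn(2, 77, 64)          # batch=2, seq=77, inner_dim=64
y = head_to_batch_dim(x, heads=8)       # shape (16, 77, 8)
z = batch_to_head_dim(y, heads=8)       # shape (2, 77, 64)
assert np.allclose(x, z)
```

So the call at line 76 is not a typo: head_to_batch_dim is applied before the attention computation and batch_to_head_dim after it, to undo the head split.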