kaiwang960112 / SpeeD

SpeeD: A Closer Look at Time Steps is Worthy of Triple Speed-Up for Diffusion Model Training
Apache License 2.0

Related works #3

Closed. byeongjun-park closed this issue 1 month ago.

byeongjun-park commented 1 month ago

Hi, thank you for sharing your work!

I am also interested in related works. Are you planning to conduct experiments such as a comparison with ANT [1] or a compatibility study with DTR [2]?

[1] Addressing Negative Transfer in Diffusion Models, NeurIPS 2023.
[2] Denoising Task Routing for Diffusion Models, ICLR 2024.

gohyojun15 commented 1 month ago

Hello, SpeeD team!

Thank you for making this interesting project open source! As @byeongjun-park noted, those works are closely related to SpeeD: they also aim to improve both convergence speed and absolute performance, through loss reweighting and architectural design. They likewise rethink diffusion training as multi-task learning over denoising tasks across timesteps, which aligns well with your work's closer look at timesteps.
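For readers following the thread, here is a minimal sketch of the shared idea, timestep-dependent loss reweighting for diffusion training, assuming a standard epsilon-prediction objective. The `timestep_weight` function below is a hypothetical bell-shaped weighting used purely for illustration; it is not the actual scheme from SpeeD, ANT, or DTR.

```python
import torch

def timestep_weight(t, num_steps=1000):
    # Hypothetical weighting: emphasize mid-range timesteps for illustration.
    # NOT the weighting proposed in SpeeD, ANT, or DTR.
    return torch.exp(-((t.float() / num_steps - 0.5) ** 2) / 0.05)

def reweighted_diffusion_loss(model, x0, alphas_cumprod):
    # Standard epsilon-prediction objective with a per-timestep loss weight.
    num_steps = alphas_cumprod.shape[0]
    t = torch.randint(0, num_steps, (x0.shape[0],), device=x0.device)
    noise = torch.randn_like(x0)
    a = alphas_cumprod[t].view(-1, *([1] * (x0.dim() - 1)))
    x_t = a.sqrt() * x0 + (1 - a).sqrt() * noise  # forward process q(x_t | x_0)
    pred = model(x_t, t)                          # predict the added noise
    per_sample = ((pred - noise) ** 2).flatten(1).mean(dim=1)
    return (timestep_weight(t, num_steps) * per_sample).mean()
```

Treating each timestep's denoising objective as its own task, the weight then controls how much each task contributes to the overall training signal.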

We look forward to your positive response!

Sincerely

kaiwang960112 commented 1 month ago

Hi Byeongjun and Hyojun. Thanks a lot for your message and for introducing your impressive works. We will discuss your papers in the next version of ours. Meanwhile, we plan to explore compatibility with them. I believe these experiments will strengthen our research! Thanks again!

Sincerely, Kai

byeongjun-park commented 1 month ago

Hello Kai,

Thank you for your prompt response. We are excited about the possibility of our work contributing to your incredible project. If you have any questions or need further information related to our work, please feel free to contact us.

Wishing you a wonderful weekend.

Sincerely, Byeongjun

kaiwang960112 commented 1 month ago

Hi Byeongjun, Thanks a lot. Yes, for sure, we will reach out for help if we run into any issues. Happy weekend!

Sincerely, Kai