xiefan-guo / initno

[CVPR 2024] InitNO: Boosting Text-to-Image Diffusion Models via Initial Noise Optimization
https://xiefan-guo.github.io/initno
Apache License 2.0

endless loop #2

Open Liming1013 opened 1 month ago

Liming1013 commented 1 month ago

This code runs in an endless loop at times.

DimitriosKakouris commented 2 weeks ago

I second this. I tried the prompt "a dog and a tiger" with seed 50, and as the optimization goes on it does not reach valid noise within the first 5 rounds of 10 steps (i.e., Tmax_round and Tmax_step). It then goes on to run fn_initno for 50 steps and just loops there endlessly.
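To make the failure mode concrete, here is a toy sketch of the control flow I mean — this is not the repo's code; the loss is a stand-in and names like `max_round`/`max_step` are illustrative, echoing Tmax_round/Tmax_step. The point is the hard stop: after the round/step budget is spent, the loop must hand back the best noise seen so far instead of retrying forever.

```python
def toy_loss(noise):
    # Stand-in for the cross-attention response loss; minimized at noise == 0.5.
    return (noise - 0.5) ** 2

def optimize_noise(noise, threshold=1e-4, max_round=5, max_step=10, lr=0.3):
    """Run up to max_round rounds of max_step optimization steps.

    Returns (noise, success). If no round reaches the threshold, return
    the best candidate found rather than looping indefinitely.
    """
    best_noise, best_loss = noise, toy_loss(noise)
    for _ in range(max_round):
        for _ in range(max_step):
            grad = 2 * (noise - 0.5)      # analytic gradient of the toy loss
            noise -= lr * grad            # one optimization step
            loss = toy_loss(noise)
            if loss < best_loss:
                best_noise, best_loss = noise, loss
            if loss < threshold:
                return noise, True        # reached "valid" noise
        noise = best_noise                # restart next round from the best point
    # Budget exhausted: hard stop with the best candidate, no further retries.
    return best_noise, False

noise, ok = optimize_noise(noise=2.0)
```

With a generous budget the toy loss converges and `ok` is True; with a tiny budget (e.g. `max_round=1, max_step=2`) the guard triggers and returns the best partial result instead of spinning.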

xiefan-guo commented 2 weeks ago

I'm sorry for the delayed response. We tried the prompt "a dog and a tiger" with seed 50 but did not encounter the issue you mentioned. Here are the output and log: a dog and a tiger_50

log

I hope this has been helpful.

DimitriosKakouris commented 2 weeks ago

Hello, first of all thank you very much for your response. I forgot to mention earlier that I am using stable-diffusion-2-1. I patched your code based on this Attend-and-Excite commit, https://github.com/yuval-alaluf/Attend-and-Excite/commit/15c30b1126af2b80a4142e4df9855e7dc480a83d, which adds support for 2.1 (see https://github.com/yuval-alaluf/Attend-and-Excite/issues/14), essentially changing the slicing of the cross-attention maps in fn_compute_loss() like this: https://github.com/yuval-alaluf/Attend-and-Excite/blob/163efdfd341bf3590df3c0c2b582935fbc8e8343/pipeline_attend_and_excite.py#L198C8-L208C1
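As I understand it, the gist of that patch is that the per-token attention must be sliced using the prompt's actual token count rather than a fixed `[1:-1]`, since the 2.1 text encoder pads the sequence. A rough pure-Python illustration of the idea (the attention values, the `last_idx` computation, and the function name are made up for this example; the real code slices tensors inside fn_compute_loss()):

```python
import math

def slice_prompt_attention(per_token_attn, num_prompt_tokens):
    """Keep only real prompt tokens from a padded per-token attention list.

    per_token_attn holds one value per token position, laid out as
    [SOT, tok_1, ..., tok_n, EOT, pad, pad, ...]. A fixed [1:-1] slice
    would keep EOT and padding here, so we slice by prompt length instead.
    """
    last_idx = 1 + num_prompt_tokens         # first index past the real tokens
    sliced = per_token_attn[1:last_idx]      # drop SOT, EOT and padding
    # Renormalize with a softmax so the kept tokens still sum to 1.
    exps = [math.exp(a) for a in sliced]
    total = sum(exps)
    return [e / total for e in exps]

# SOT, 3 prompt tokens, EOT, 2 padding positions
attn = [0.9, 0.4, 0.3, 0.1, 0.05, 0.0, 0.0]
probs = slice_prompt_attention(attn, num_prompt_tokens=3)
```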

That would explain why you get a clean execution when you try the prompt and seed. I do 20 max_iter_to_alter (chose half of inference steps) because I later do 40 inference steps. I am using the code you provide until I get the optimized latents, which later I utilize in my own code. I acknowledge the fact that your paper states that you use v1 of SD. I am just wondering what would cause the endless loop in the initial noise optimization. If there is anything I missed or a fault in my thought process please do not hesitate to point it out.