Fayeben / GenerativeDiffusionPrior

Generative Diffusion Prior for Unified Image Restoration and Enhancement (CVPR2023)
Apache License 2.0

Testing the model with DDIM #29

Open hueledao opened 1 year ago

hueledao commented 1 year ago

Hello, thank you for your work. I successfully ran your code for the inpainting task with 1000 diffusion steps. How can I test the model with 20 denoising steps using DDIM?

I used this command line:

```shell
python sample_x0_inp.py --attention_resolutions 32,16,8 --class_cond False --image_size 256 --learn_sigma True --noise_schedule linear --num_channels 256 --num_head_channels 64 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True --use_img_for_guidance --start_from_scratch --save_png_files --diffusion_steps 20 --use_ddim True
```

but I received "nan" like this:

```
step t 19 img guidance has been used, mse is 0.25376278 * 4000 = 1015.05
step t 18 img guidance has been used, mse is nan * 4000 = nan
step t 17 img guidance has been used, mse is nan * 4000 = nan
```
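(For reference, one plausible mechanism for this log, an assumption since the sampler hasn't been traced here: with `--use_fp16` the guidance update runs in half precision, which overflows above ~65504; an overflowed value becomes `inf`, and `inf - inf` is `nan`, which then poisons every subsequent denoising step. A minimal numpy illustration:)

```python
import numpy as np

# float16 overflows above ~65504 to inf, and inf - inf is nan.
# Once this happens at one step, the nan propagates through all
# later denoising steps, matching the log above.
with np.errstate(all="ignore"):
    big = np.float16(500.0) * np.float16(200.0)  # 100000 > float16 max
    diff = big - big
print(big)   # inf
print(diff)  # nan
```

This would also explain why the maintainer's advice below is to lower the guidance scale: a smaller scale keeps the per-step update inside float16's range.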

I have no idea what I got wrong. I hope you can help me. Thank you very much!

Fayeben commented 1 year ago

Hi, sorry for the late reply; we have been busy with our project. When using DDIM, the guidance scale needs to be adjusted: if the MSE loss is NaN, use a smaller guidance scale.

hueledao commented 1 year ago

Thank you for your response. Could I ask which guidance scale you used for inpainting with DDIM? Thank you very much.


hueledao commented 1 year ago

> Hi, since we are busy with our project, we are sorry for the late reply. When using DDIM, the guidance scale needs to be adjusted. If mse loss is nan, you can use a smaller guidance scale.

I tried smaller guidance scales (10, 100, 1000) and got the same NaN problem. When I changed diffusion_steps from 20 to 200 it ran, but the final results are almost pure noise; it can't generate anything. Maybe my command line is wrong. Could I ask for the correct command line to use DDIM with a small number of diffusion steps? Thank you very much.

This is my command line:

```shell
python sample_x0_inp.py --attention_resolutions 32,16,8 --class_cond False --image_size 256 --learn_sigma True --noise_schedule linear --num_channels 256 --num_head_channels 64 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True --use_img_for_guidance --start_from_scratch --save_png_files --use_ddim True --diffusion_steps 200 --img_guidance_scale 100
```
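(For what it's worth, this repository builds on OpenAI's guided-diffusion, where `--diffusion_steps` sets the length of the training noise schedule (normally 1000) and test-time sub-sampling is done with `--timestep_respacing`, e.g. `ddim20`. Assuming `sample_x0_inp.py` inherits that argument parser, which has not been verified here, a 20-step DDIM run would keep `--diffusion_steps` at 1000 and look like:)

```shell
# Sketch under the assumption of guided-diffusion-style arguments:
# --timestep_respacing shortens sampling; --diffusion_steps stays 1000.
python sample_x0_inp.py --attention_resolutions 32,16,8 --class_cond False \
  --image_size 256 --learn_sigma True --noise_schedule linear \
  --num_channels 256 --num_head_channels 64 --num_res_blocks 2 \
  --resblock_updown True --use_fp16 True --use_scale_shift_norm True \
  --use_img_for_guidance --start_from_scratch --save_png_files \
  --use_ddim True --diffusion_steps 1000 --timestep_respacing ddim20 \
  --img_guidance_scale 100
```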