openai / consistency_models

Official repo for consistency models.
MIT License
6.02k stars 409 forks

Images generated with the pre-trained model look very bad, and I don't know what's going on. #25

Open stonecropa opened 1 year ago

stonecropa commented 1 year ago

This is the command I ran:

```
python image_sample.py --batch_size 8 --training_mode consistency_distillation --sampler multistep --ts 0,67,150 --steps 151 --model_path E:\Googledownload\ct_bedroom256.pt --attention_resolutions 32,16,8 --class_cond False --use_scale_shift_norm False --dropout 0.0 --image_size 256 --num_channels 256 --num_head_channels 64 --num_res_blocks 2 --num_samples 500 --resblock_updown True --use_fp16 True --weight_schedule uniform
```

The resulting images are shown below.

[image]

I don't know why this happens. Is there a good way to fix it? Thanks.

SherlockJane commented 1 year ago

I got the same bad result as you. May I ask, did you change attention_type="default" in unet.py?

stonecropa commented 1 year ago

@SherlockJane YES

aarontan-git commented 1 year ago

> I got the same bad result as you. May I ask, did you change attention_type="default" in unet.py?

I got significantly better image quality after I used attention_type="flash"
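For intuition, here is an illustrative sketch (not code from this repo; the function names are mine) of the kind of numerical issue that can make one attention implementation produce garbage while another works, especially with --use_fp16: a naive softmax over large attention scores overflows to inf/NaN, while a max-subtracted softmax stays finite.

```python
import numpy as np

def softmax_naive(x):
    # Overflows for large scores: np.exp(1000.0) -> inf, and inf/inf -> NaN
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def softmax_stable(x):
    # Subtracting the row max keeps exp() in a safe range; the result is identical
    # in exact arithmetic because softmax is shift-invariant
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Unscaled attention scores can get this large in half precision pipelines
scores = np.array([[1000.0, 1001.0, 999.0]])
print(np.isnan(softmax_naive(scores)).any())   # True: the naive version blows up
print(softmax_stable(scores).round(3))         # finite probabilities summing to 1
```

This does not prove that the "default" attention path here is broken in exactly this way, but it shows why two mathematically equivalent attention implementations can give very different sample quality.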

SherlockJane commented 1 year ago

> > I got the same bad result as you. May I ask, did you change attention_type="default" in unet.py?
>
> I got significantly better image quality after I used attention_type="flash"

Thanks. However, I was unable to use the attention_type="flash" option on the V100. I'm looking for other ways.
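That failure is consistent with flash-attn 1.x targeting newer GPU architectures: to the best of my knowledge its kernels require compute capability 7.5 or higher (Turing/Ampere), while the V100 is compute capability 7.0. A tiny sketch of that check (the helper below is hypothetical, not part of flash-attn or this repo):

```python
# Hypothetical helper: flash-attn 1.x kernels are built for compute
# capability >= 7.5 (Turing/Ampere); the V100 is sm_70, so it is excluded.
def supports_flash_attn(major, minor):
    """Return True if a GPU with this compute capability can run flash-attn 1.x."""
    return (major, minor) >= (7, 5)

print(supports_flash_attn(7, 0))  # V100 -> False
print(supports_flash_attn(8, 0))  # A100 -> True
```

On a real machine you could feed this the result of `torch.cuda.get_device_capability()`.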

stonecropa commented 1 year ago

@SherlockJane
I have successfully generated good images. Install flash-attn==1.0.2.
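For reference, pinning that version from PyPI would look like the following (assuming PyTorch and a CUDA toolchain are already installed, since flash-attn builds custom kernels):

```shell
pip install flash-attn==1.0.2
```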

treefreq commented 1 year ago

> I have successfully generated good images. Install flash-attn==1.0.2.

@SherlockJane I use flash-attn==1.0.2 with attention_type="default", and my GPU is a V100, but I still do not get good images. What is your GPU?

stonecropa commented 1 year ago

You need attention_type="flash".

SherlockJane commented 1 year ago

@treefreq It's a V100, and I cannot use the attention_type="flash" option.

ayushtues commented 1 year ago

Added PR #37 to get good quality without using flash attention.