ZGCTroy / LayoutDiffusion


Relationship between classifier_free_dropout (0.2) and classifier_scale (1.0)? #15

Open shileims opened 1 year ago

shileims commented 1 year ago

Hi author, I noticed that your classifier_free_dropout is 0.2, but what is the reason for setting classifier_scale to 1.0? Thanks

ZGCTroy commented 1 year ago

classifier_scale is the scale of classifier-free guidance at inference time. classifier_free_dropout means that the condition is dropped with probability 0.1 during training.

You can refer to the Classifier-Free Guidance paper.
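
For readers landing here, a minimal sketch of what training-time condition dropout could look like. The names `cond_emb`, `null_emb`, and the exact call signature are assumptions for illustration, not the repository's actual code:

```python
import torch

def maybe_drop_condition(cond_emb, null_emb, classifier_free_dropout=0.2):
    """With probability `classifier_free_dropout`, replace each sample's
    layout/condition embedding with a learned "null" embedding, so the same
    network also learns the unconditional distribution."""
    batch_size = cond_emb.shape[0]
    # Bernoulli mask: True -> drop the condition for this sample
    drop = torch.rand(batch_size, device=cond_emb.device) < classifier_free_dropout
    drop = drop.view(batch_size, *([1] * (cond_emb.dim() - 1)))
    return torch.where(drop, null_emb.expand_as(cond_emb), cond_emb)
```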

shileims commented 1 year ago

Hi @ZGCTroy, thanks for your reply. I mean, if classifier_scale is 1.0, does that mean sampling doesn't use classifier-free guidance, while training does? In terms of the code, uncond_mean is not used if classifier_free_scale is equal to 1.0:

    mean = cond_mean + cfg.sample.classifier_free_scale * (cond_mean - uncond_mean)

Thank you.
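A side note on the two common parameterizations, since they differ by an offset of 1 in the scale: `cond_mean + s * (cond_mean - uncond_mean)` (the form quoted above) is algebraically equal to `uncond_mean + (1 + s) * (cond_mean - uncond_mean)`, so which scale value corresponds to "no guidance" depends on the convention being used. A hedged sketch of the sampling-time combination; `model`, `cond_emb`, and `null_emb` here are placeholders, not the repository's exact interfaces:

```python
import torch

@torch.no_grad()
def guided_mean(model, x_t, t, cond_emb, null_emb, classifier_free_scale=1.0):
    """Classifier-free guidance at sampling time: run the same network twice,
    once with the real condition and once with the "null" condition, then
    extrapolate away from the unconditional prediction."""
    cond_mean = model(x_t, t, cond_emb)    # conditional prediction
    uncond_mean = model(x_t, t, null_emb)  # unconditional prediction
    # Same form as the snippet quoted above; equivalent to
    # uncond_mean + (1 + classifier_free_scale) * (cond_mean - uncond_mean).
    return cond_mean + classifier_free_scale * (cond_mean - uncond_mean)
```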

ZGCTroy commented 1 year ago

Yes. Specifically, classifier-free guidance is only a technique for strengthening the condition at sampling time. However, it requires both a conditional and an unconditional model. Instead of training two models, the authors propose training only one conditional model by dropping out the condition with a fixed probability during training.

shileims commented 1 year ago

Hi @ZGCTroy, thank you so much! I just want to clarify that classifier-free guidance dropout is applied during training, not during sampling. Thank you so much!