jychoi118 / P2-weighting

CVPR 2022
MIT License

About the re-weighted loss #5

Open thuxmf opened 1 year ago

thuxmf commented 1 year ago

https://github.com/jychoi118/P2-weighting/blob/3ea1470e59eb4f4f37a5ecc41edbc9e2e626905b/guided_diffusion/gaussian_diffusion.py#L818

I found that you multiply the final loss by this weight, whose denominator is greater than 1 since self.p2_k >= 1 and self.snr > 0. The weight is therefore always smaller than 1. How, then, do you achieve the result that the total weight of your method is greater than the DDPM baseline when the SNR is in the interval [1e-2, 1e0]?

[Screenshot: weight-vs-SNR plot from the paper]
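
For reference, the weight on the referenced line has the form 1 / (p2_k + snr) ** p2_gamma. The standalone sketch below is my own reproduction (it assumes the linear beta schedule used by guided_diffusion and the defaults k = 1, gamma = 1, which are not stated in this thread) and confirms that every raw weight is indeed below 1:

```python
# Standalone sketch (not the repository's code): reproduce the weight
# 1 / (p2_k + snr) ** p2_gamma for an assumed linear beta schedule.
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)               # linear schedule (assumed)
alphas_cumprod = np.cumprod(1.0 - betas)
snr = alphas_cumprod / (1.0 - alphas_cumprod)    # SNR(t) = alpha_bar_t / (1 - alpha_bar_t)

p2_k, p2_gamma = 1.0, 1.0                        # assumed hyper-parameter values
weight = 1.0 / (p2_k + snr) ** p2_gamma

# With p2_k >= 1 and snr > 0 the denominator exceeds 1, so every raw
# weight is below 1, exactly as the question observes.
print(weight.max())                              # < 1.0
```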
jychoi118 commented 1 year ago

The plot above shows the normalized weights, where the sum of the weights is 1. The plot without normalization is shown in the appendix; there, the weights are smaller than the baseline, as you expected.

[Image: unnormalized weight plot from the appendix]
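
To make the normalization point concrete, here is a small sketch (my own, not the authors' plotting code; it assumes the same linear schedule and k = 1, gamma = 1 as above). It normalizes both weighting schemes to sum to 1 and compares their total mass over the SNR interval [1e-2, 1e0]:

```python
# Sketch: after normalizing each scheme to sum to 1, P2 puts more total
# weight than the DDPM baseline on the mid-SNR range [1e-2, 1e0].
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)               # assumed linear schedule
alphas_cumprod = np.cumprod(1.0 - betas)
snr = alphas_cumprod / (1.0 - alphas_cumprod)

baseline = np.ones(T)                            # DDPM simple loss: weight 1 at every t
p2 = 1.0 / (1.0 + snr) ** 1.0                    # P2 weight with assumed k = 1, gamma = 1

baseline_norm = baseline / baseline.sum()        # each scheme now sums to 1
p2_norm = p2 / p2.sum()

mid = (snr >= 1e-2) & (snr <= 1e0)
print(p2_norm[mid].sum(), baseline_norm[mid].sum())
# The first number is larger: relative to its own total, P2 spends a bigger
# fraction of its weight in this SNR range, even though its raw weights are < 1.
```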

thuxmf commented 1 year ago

Thanks!

HaoLyou commented 1 year ago

Why is the numerator in the code 1, rather than the lambda mentioned in the paper?

https://github.com/jychoi118/P2-weighting/blob/3ea1470e59eb4f4f37a5ecc41edbc9e2e626905b/guided_diffusion/gaussian_diffusion.py#L818
