Open · HIT-LiuChen opened this issue 1 year ago
I have the same doubt. What's the reason for doing this?
I have the same question: shouldn't the variance be `self.posterior_var`?
Have you solved this? If so, can you fill me in?
@zoubohao Could you answer this issue? Thank you!
In the DDPM paper, the variance is fixed to either `posterior_var` or `betas`. I don't understand why the two are concatenated here.
In `Diffusion.py`, `posterior_var` is calculated on line 66. Why does line 77 use `torch.cat([self.posterior_var[1:2], self.betas[1:]])` to extract the variance instead of `posterior_var`?
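For what it's worth, this looks like the "fixed large" variance choice (σ_t² = β_t) with a twist: the very first entry of `posterior_var` is exactly zero (since ᾱ_prev at the first step is 1), so taking its log would give -inf. Replacing that first entry with `posterior_var[1]` keeps the log-variance finite. Here is a minimal sketch reproducing the effect; the schedule values (`T`, the β range) are illustrative, not taken from the repo:

```python
import torch

# Illustrative linear beta schedule (values are assumptions, not the repo's).
T = 1000
betas = torch.linspace(1e-4, 0.02, T).double()
alphas = 1.0 - betas
alphas_bar = torch.cumprod(alphas, dim=0)
# abar_prev[0] = 1 by convention (no noise before the first step).
alphas_bar_prev = torch.cat([torch.ones(1).double(), alphas_bar[:-1]])

# Posterior variance from the DDPM paper:
# beta_t * (1 - abar_{t-1}) / (1 - abar_t).
posterior_var = betas * (1.0 - alphas_bar_prev) / (1.0 - alphas_bar)

# First entry is exactly 0, so log(posterior_var[0]) would be -inf.
assert posterior_var[0].item() == 0.0

# "Fixed large" variance sigma_t^2 = beta_t, with the first entry
# swapped for posterior_var[1] so the log-variance stays finite.
var = torch.cat([posterior_var[1:2], betas[1:]])
assert torch.isfinite(torch.log(var)).all()
```

So the concatenation is not computing the posterior variance; it is selecting the β_t variance option while patching the degenerate first step.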