yijicheng opened 1 year ago
Adding to this question:
On this line there is a comment that says

```python
# manually backward, since we omitted an item in grad and cannot simply autodiff.
```

Could you clarify what this means? What exactly is omitted, i.e. why can't we just compute a scalar loss and call `loss.backward()` directly?
Thank you!
@gaoalexander Hi, please check the SDS loss section of the original paper. The gradient through the diffusion model's U-Net is omitted.
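For readers hitting the same question: in the DreamFusion paper's notation, the full derivative of the diffusion training loss w.r.t. the scene parameters $\theta$ contains a U-Net Jacobian term $\partial \hat\epsilon_\phi(z_t; y, t) / \partial z_t$, which is expensive to compute and empirically unhelpful. SDS drops it, leaving

$$\nabla_\theta \mathcal{L}_{\text{SDS}} = \mathbb{E}_{t,\epsilon}\!\left[ w(t)\,\big(\hat\epsilon_\phi(z_t; y, t) - \epsilon\big)\,\frac{\partial x}{\partial \theta} \right].$$

Since this expression is no longer the gradient of any scalar loss the autograd engine ever sees, the code injects it manually via `tensor.backward(gradient=...)`. Below is a minimal sketch of that trick, assuming a generic PyTorch setup; `TinyUNet` and the toy "renderer" are placeholders, not this repo's actual modules:

```python
import torch
import torch.nn as nn

# Minimal sketch of SDS's manual backward. TinyUNet and the toy
# "renderer" are stand-ins, not the real denoiser and NeRF.

class TinyUNet(nn.Module):
    def __init__(self, dim=4):
        super().__init__()
        self.net = nn.Linear(dim, dim)

    def forward(self, z_t, t):
        return self.net(z_t)  # stand-in for the predicted noise eps_hat

torch.manual_seed(0)
unet = TinyUNet()

theta = torch.randn(1, 4, requires_grad=True)  # stands in for NeRF params
latents = theta * 2.0                          # x = g(theta): a toy "render"

t = 0.5
eps = torch.randn_like(latents)
z_t = (1 - t) * latents + t * eps              # toy forward diffusion

with torch.no_grad():                          # never autodiff the U-Net
    eps_hat = unet(z_t, t)

w_t = 1.0                                      # weighting w(t), constant here
grad = w_t * (eps_hat - eps)                   # SDS gradient, Jacobian dropped

# Seed d(loss)/d(latents) = grad and let autograd carry it back through
# g(theta) only. This is the line the code comment refers to.
latents.backward(gradient=grad, retain_graph=True)
print(theta.grad)                              # == 2.0 * grad by chain rule
```

The key point is that `eps_hat` is computed under `torch.no_grad()`, so backprop never touches the U-Net; `latents.backward(gradient=grad)` seeds the chain rule at `latents` with `grad` and propagates only through $g(\theta)$.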
Thanks for your excellent work!
I am confused about how

```python
latents.backward(gradient=grad, retain_graph=True)
```

computes the loss. And if I want to add another loss on top of the SDS loss, how should I modify this backward call?
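In case it helps while this is open: `backward(gradient=grad)` never computes a scalar loss at all; it seeds the vector-Jacobian product at `latents` with `grad`. One common pattern for adding a second loss (a sketch, not this repo's code) is to rewrite the injection as a scalar surrogate loss whose gradient w.r.t. `latents` equals `grad`, then sum it with the extra term:

```python
import torch

torch.manual_seed(0)
theta = torch.randn(1, 4, requires_grad=True)   # toy upstream parameters
latents = theta * 2.0                           # toy differentiable "render"
grad = torch.randn_like(latents)                # stand-in for w(t)*(eps_hat - eps)

# d(sds_loss)/d(latents) == grad, because grad is detached from the graph.
sds_loss = (grad.detach() * latents).sum()

other_loss = latents.pow(2).mean()              # any extra differentiable loss
(sds_loss + 0.1 * other_loss).backward()        # one backward for both terms
print(theta.grad)
```

Equivalently, since `.grad` accumulates across backward calls, you can keep `latents.backward(gradient=grad, retain_graph=True)` unchanged and simply call `other_loss.backward()` afterwards, before the optimizer step.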