ashawkey / stable-dreamfusion

Text-to-3D & Image-to-3D & Mesh Exportation with NeRF + Diffusion.
Apache License 2.0

How can I modify gradient backward process? #113

Open yijicheng opened 1 year ago

yijicheng commented 1 year ago

Thanks for your excellent work!

I am confused about how latents.backward(gradient=grad, retain_graph=True) computes the loss.

And if I want to add another loss on top of the SDS loss, how should I modify this backward call?

ashawkey commented 1 year ago

@yijicheng Hi, you could add the extra loss here.
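
For concreteness, here is a minimal sketch of adding a second loss next to the manual SDS backward, assuming the names used in this repo's train_step (latents, noise_pred, noise, w); extra_loss_fn and lambda_extra are hypothetical placeholders for whatever term you want to add:

```python
import torch

# As in train_step: the hand-computed SDS gradient w.r.t. the rendered latents.
grad = w * (noise_pred - noise)
grad = torch.nan_to_num(grad)

# SDS step: inject the precomputed gradient directly; no scalar loss is
# autodiffed here. retain_graph=True keeps the graph alive for the next call.
latents.backward(gradient=grad, retain_graph=True)

# An extra loss can be an ordinary scalar; its backward() accumulates into
# the same parameter .grad buffers before the optimizer step.
extra_loss = lambda_extra * extra_loss_fn(latents)  # hypothetical helper
extra_loss.backward()
```

Both backward calls accumulate gradients into the same parameters, so the optimizer step sees the sum of the SDS gradient and the extra loss's gradient.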

gaoalexander commented 1 year ago

Adding to this question:

On this line there is a comment that says # manually backward, since we omitted an item in grad and cannot simply autodiff.

Could you clarify what this means? What is omitted exactly? I.e., why can't we compute a scalar loss and call loss.backward() directly?

Thank you!

ashawkey commented 1 year ago

@gaoalexander Hi, please check the SDS loss section of the original DreamFusion paper. The gradient through the diffusion model's U-Net is omitted.
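
To spell that out: the SDS gradient defined in the DreamFusion paper deliberately drops the U-Net Jacobian ∂ε̂_φ/∂z_t that the full derivative of the denoising loss would contain (the paper reports it is expensive to compute and that omitting it works well):

```latex
% SDS gradient (DreamFusion): the U-Net Jacobian is omitted, i.e. treated as identity.
\nabla_\theta \mathcal{L}_{\mathrm{SDS}}
  \triangleq \mathbb{E}_{t,\epsilon}\!\left[
      w(t)\,\bigl(\hat\epsilon_\phi(z_t;\, y, t) - \epsilon\bigr)\,
      \frac{\partial z}{\partial \theta}
  \right]
```

Because of that omission, this expression is not the gradient of any scalar computed in the forward pass, so there is no loss tensor to call backward() on; the code instead feeds the precomputed grad straight into latents.backward(gradient=grad). An equivalent scalar surrogate, sketched here assuming latents requires grad and grad is already computed, builds a detached target whose MSE gradient w.r.t. latents is exactly grad:

```python
import torch.nn.functional as F

# d(loss)/d(latents) == grad, because the target is detached from the graph.
target = (latents - grad).detach()
loss = 0.5 * F.mse_loss(latents, target, reduction='sum')
loss.backward()
```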