Closed xbillowy closed 10 months ago
Hi, sorry for the late reply.
The code is correct because the loss $L$ from the paper and the implemented loss $L'$ have the same gradient with respect to $\theta$.
I wrote it as `-p * w.detach() ** gce`
simply because it's more similar to the form of the cross-entropy loss.
But you can certainly change it to the form in the paper.
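The gradient equivalence above can be checked numerically. The sketch below is not from the repo; it assumes `p` stands for $\log w(x)$ and that `w.detach()` acts as a stop-gradient, i.e. it compares $L(w) = \frac{1 - w^q}{q}$ against $L'(w) = -\log(w)\,\mathrm{sg}(w)^q$, whose derivatives with respect to $w$ both equal $-w^{q-1}$:

```python
import math

def grad_paper(w, q):
    # d/dw [(1 - w^q) / q] = -w^(q-1)
    return -w ** (q - 1)

def grad_impl(w, q):
    # d/dw [-log(w) * c], where c = w^q is held constant ("detached"),
    # is -c / w = -w^(q-1), matching the paper's form at the current w.
    c = w ** q  # stop-gradient: treated as a constant during differentiation
    return -c / w

# Check agreement over a range of weights and q values.
for w in (0.1, 0.5, 0.9):
    for q in (0.3, 0.7, 1.0):
        assert math.isclose(grad_paper(w, q), grad_impl(w, q), rel_tol=1e-9)
print("gradients match")
```

So although the implemented expression looks different from Eq. (3), both forms push $\theta$ in the same direction during optimization.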
Thank you very much for your response, it helps a lot!
Hello! Thanks for sharing the code of your excellent work!
When I checked the implementation of the GCE loss in https://github.com/cvlab-stonybrook/s-volsdf/blob/f799f5b648f477a50a0d1921f2950939f7b7d794/volsdf/model/loss.py#L63, it seemed different from Eq. (3) in the paper: when q = 0 it is the cross-entropy loss, but when 0 < q <= 1 I cannot find the term $\frac{1 - w(x)^{q}}{q}$, only

`-p * w.detach() ** gce`

I would like to ask whether there is a misunderstanding here, or another reason. Thank you!