mjmjeong / InfoNeRF


Request for clarification of KL Divergence Loss (a.k.a. Smooth Loss) #10

Open JoohyeokKim opened 8 months ago

JoohyeokKim commented 8 months ago

Hi, I'd like some clarification on the KL loss (a.k.a. SmoothLoss) in your code.

According to the notation in your paper, the KL loss tries to pull the distribution of an unseen ray (namely r_tilde) toward that of the corresponding seen ray (namely r). In your implementation, however, the first and second arguments passed to KLDivLoss are r and r_tilde, respectively.

https://pytorch.org/docs/1.4.0/nn.html?highlight=kldivloss#torch.nn.KLDivLoss
https://pytorch.org/docs/stable/generated/torch.nn.KLDivLoss.html#torch.nn.KLDivLoss
These are the official KLDivLoss documentation pages for PyTorch 1.4.0 and the current release, respectively.

According to the PyTorch documentation, the second argument is the target and the first argument is the prediction (the input, which is expected to contain log-probabilities).
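For reference, here is a minimal, self-contained sketch (not taken from your repository) of how I understand that convention: kl_div(input, target) with input in log-space evaluates D_KL(target || exp(input)).

```python
# Minimal sketch (not from this repo) of PyTorch's KLDivLoss convention:
# kl_div(input, target) with `input` in log-space computes
# D_KL(target || exp(input)) = sum(target * (log(target) - input)).
import torch
import torch.nn.functional as F

p = torch.softmax(torch.randn(8), dim=0)  # "target" distribution
q = torch.softmax(torch.randn(8), dim=0)  # "prediction" distribution

kl_loss = F.kl_div(q.log(), p, reduction='sum')  # D_KL(p || q)
manual = torch.sum(p * (p.log() - q.log()))      # definition of D_KL(p || q)
print(torch.allclose(kl_loss, manual))           # True
```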

This means the distribution of the seen ray (r) is treated as the prediction and pushed toward that of the unseen ray (r_tilde), which is the opposite of the direction described in the paper.
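To make the concern concrete, here is a hypothetical sketch of the two argument orders; p_seen and p_unseen are made-up names for normalized ray densities, not variables from your code.

```python
# Hypothetical illustration of the two argument orders; `p_seen` and
# `p_unseen` are made-up names for normalized densities along a seen ray r
# and an unseen ray r_tilde, not variables from the actual implementation.
import torch
import torch.nn.functional as F

p_seen = torch.softmax(torch.randn(64), dim=0)
p_unseen = torch.softmax(torch.randn(64), dim=0)

# How I read the current code: r first, r_tilde second, so PyTorch treats
# r as the prediction and r_tilde as the target, i.e. D_KL(p_unseen || p_seen).
loss_as_in_code = F.kl_div(p_seen.log(), p_unseen, reduction='sum')

# The direction I understood from the paper: pull the unseen ray's
# distribution toward the seen ray's, i.e. D_KL(p_seen || p_unseen),
# which would correspond to swapping the arguments.
loss_as_in_paper = F.kl_div(p_unseen.log(), p_seen, reduction='sum')
```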

Could you clarify which is correct: the equation in the paper or the implementation in the code?