baegwangbin / surface_normal_uncertainty

[ICCV 2021 Oral] Estimating and Exploiting the Aleatoric Uncertainty in Surface Normal Estimation

Training loss is negative. #7

Closed by guangkaixu 1 year ago

guangkaixu commented 1 year ago

Hi, thanks so much for your excellent work on surface normal estimation. When I was training on the Taskonomy dataset with the paper's proposed loss function 'UG_NLL_ours' (i.e., I simply replaced the img and norm paths in the Nyuloader), the loss kept decreasing and eventually became negative. Is this okay for training, or might there be a mistake somewhere?

baegwangbin commented 1 year ago

Hi, yes, this is perfectly natural. The loss is a negative log-likelihood, and probability density, unlike probability mass, can be higher than 1; whenever the predicted density at the target exceeds 1, its negative log is negative.
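For intuition, here is a minimal sketch. It uses a plain 1-D Gaussian NLL rather than the repo's actual `UG_NLL_ours` angular loss, but the same principle applies: a confident prediction (small sigma) puts a density greater than 1 at the target, so the NLL goes negative.

```python
import math
import torch

# Hypothetical illustration, NOT the repo's UG_NLL_ours loss:
# the negative log-likelihood of a *continuous* density goes negative
# as soon as the predicted density at the target exceeds 1.
error = torch.tensor(0.01)  # small residual between prediction and ground truth
sigma = torch.tensor(0.05)  # small predicted (aleatoric) uncertainty

# Gaussian NLL: 0.5 * log(2*pi*sigma^2) + error^2 / (2*sigma^2)
nll = 0.5 * torch.log(2 * math.pi * sigma ** 2) + error ** 2 / (2 * sigma ** 2)
print(nll.item())  # ~ -2.06: negative, because the Gaussian density at `error` is > 1
```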

guangkaixu commented 1 year ago

Oh, I understand. Thank you very much for the great work and the fast response!