baegwangbin / surface_normal_uncertainty

[ICCV 2021 Oral] Estimating and Exploiting the Aleatoric Uncertainty in Surface Normal Estimation

Using NLL_ours as the loss function results in a negative loss #10

Open Taxalfer opened 11 months ago

Taxalfer commented 11 months ago

I wanted to train a new model on my own dataset. When I use NLL_ours as the loss function, the loss value gradually becomes negative during training, whereas training is normal with L2 or AL. I don't know how to solve this. Looking forward to your reply.

Genshin-Impact-king commented 11 months ago

Hello, I also want to train a new model on my own dataset, but I noticed that "--dataset_name" only accepts nyu/scannet. Should I load my data with /data/dataloader_custom.py, or are there other steps I should take? Thank you.

Taxalfer commented 11 months ago

/data/dataloader_custom.py is only used to load data for test.py. If you want to train on your own data, you may need to write a new dataloader.

baegwangbin commented 11 months ago

Hi, very sorry for the delayed response.

For NLL_ours, it is natural for the loss to become negative. The likelihood is a probability density, which can exceed 1, so the loss (the negative log-likelihood) can drop below 0. There is nothing to worry about.
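As a sanity check (not code from this repository), here is a minimal sketch of why a negative NLL is expected for any continuous distribution: the density at a well-predicted point can exceed 1, so its negative log is below 0. A 1-D Gaussian is used purely for illustration; the same reasoning applies to the angular distribution behind NLL_ours.

```python
import math

def gaussian_nll(x, mu, sigma):
    """Negative log-likelihood of a 1-D Gaussian N(mu, sigma^2) at x.

    For small sigma the density near the mode exceeds 1,
    so the NLL of a confident, accurate prediction is negative.
    """
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (x - mu) ** 2 / (2 * sigma ** 2)

# Confident (low-variance) prediction close to the target:
print(gaussian_nll(x=0.01, mu=0.0, sigma=0.1))  # ~ -1.38: negative, and that's fine

# Less confident prediction, same error:
print(gaussian_nll(x=0.01, mu=0.0, sigma=1.0))  # ~ 0.92: positive
```

So as the network grows more confident about pixels it predicts well, the average NLL naturally sinks below zero; the loss is still decreasing correctly.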

For custom datasets, you need to write your own dataloader, as different datasets use different formats (e.g., for the GT surface normals).
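For reference, a minimal training `Dataset` sketch. The directory layout, filename pairing, and normal encoding (RGB in [0, 255] mapped to [-1, 1]) are all assumptions here; adapt them to however your GT normals are stored, and check the repository's NYU/ScanNet dataloaders for the exact fields the training loop expects (e.g., a validity mask).

```python
import glob
import numpy as np
import torch
from torch.utils.data import Dataset
from PIL import Image

class CustomNormalDataset(Dataset):
    """Hypothetical training dataloader: paths and GT encoding are assumptions."""

    def __init__(self, root):
        # Assumed layout: root/img/*.png with a matching root/norm/*.png per image.
        self.img_paths = sorted(glob.glob(f"{root}/img/*.png"))
        self.norm_paths = sorted(glob.glob(f"{root}/norm/*.png"))

    def __len__(self):
        return len(self.img_paths)

    def __getitem__(self, idx):
        # RGB image scaled to [0, 1].
        img = np.asarray(Image.open(self.img_paths[idx]), dtype=np.float32) / 255.0
        # Assumed encoding: normals stored as RGB in [0, 255], mapped to [-1, 1].
        norm = np.asarray(Image.open(self.norm_paths[idx]), dtype=np.float32)
        norm = norm / 255.0 * 2.0 - 1.0
        # HWC -> CHW tensors for PyTorch.
        img = torch.from_numpy(img).permute(2, 0, 1)
        norm = torch.from_numpy(norm).permute(2, 0, 1)
        return {"img": img, "norm": norm}
```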