cleinc / bts

From Big to Small: Multi-Scale Local Planar Guidance for Monocular Depth Estimation
GNU General Public License v3.0

Dense depth map gives NAN #68

Open poornimajd opened 4 years ago

poornimajd commented 4 years ago

Hello, great work @cogaplex-bts and team! I have a custom dataset with dense depth maps generated from the left images. These are unlike the KITTI depth maps, which are obtained from a VLP-64 LiDAR. When I remove the "divide by 256" in the following lines, the model trains; otherwise it throws a NaN loss and training stops. https://github.com/cogaplex-bts/bts/blob/29d6a3211782197cfeb1a89838cc9626c3adc5f7/pytorch/bts_dataloader.py#L134 https://github.com/cogaplex-bts/bts/blob/29d6a3211782197cfeb1a89838cc9626c3adc5f7/pytorch/bts_dataloader.py#L166 Is this the right way to handle dense maps as input? The dense map is shown below (image attachment: argb_image_0.jpg).

Are there any other changes to be made while using such custom data? Any suggestion is greatly appreciated!

cogaplex-bts commented 4 years ago

@poornimajd Hello. Thanks for your interest in our work. The ground-truth depth maps provided by KITTI are in PNG format, with metric depth values multiplied by 256 to utilize the 16-bit pixel range [0, 65535]. If your data has no such scaling, you are correct to remove those lines. Also, please don't forget to feed the network the proper focal length values.
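To make the scaling convention concrete, here is a minimal sketch of the decoding logic described above. The `decode_depth` helper is hypothetical (the repo's actual loading happens inline in `bts_dataloader.py`), but it illustrates when the divide-by-256 applies:

```python
import numpy as np


def decode_depth(raw: np.ndarray, kitti_scaled: bool = True) -> np.ndarray:
    """Convert a raw uint16 depth image into float32 meters.

    KITTI convention: stored pixel value = depth_in_meters * 256,
    so decoding requires dividing by 256. A custom dataset whose
    depth maps are already in meters should skip that division
    (kitti_scaled=False), matching the fix discussed in this issue.
    """
    depth = raw.astype(np.float32)
    if kitti_scaled:
        depth /= 256.0  # undo KITTI's *256 PNG encoding
    return depth


# A KITTI-style pixel of 5120 decodes to 20 m; a custom metric
# dataset would instead pass its values through unchanged.
kitti_px = np.array([[5120]], dtype=np.uint16)
print(decode_depth(kitti_px))                      # 20.0 m
print(decode_depth(kitti_px, kitti_scaled=False))  # 5120.0, i.e. raw value kept
```

Feeding already-metric depth through the `/256` path squashes all values toward zero, which is one way the training loss can degenerate to NaN, so checking this convention first is a good sanity step for any custom dataset.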