Hi! Depth sensing is notoriously noisy around object edges (check Kinect, for example). These regions are marked as invalid by the zero values.
You can find our data processing code using a Sobel filter for this purpose: https://github.com/KAIR-BAIR/dycheck/blob/7d336d2d3ef72dcaa7a687241e85e788a837c84a/dycheck/datasets/iphone.py#L282-L284 Additionally, those invalid depths should not contribute to the training process: https://github.com/KAIR-BAIR/dycheck/blob/7d336d2d3ef72dcaa7a687241e85e788a837c84a/dycheck/core/losses/depth.py#L27-L34
We have experimented with training without Sobel filtering -- noisy depth is quite harmful to the final results.
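For reference, here is a minimal sketch of those two steps (not the exact code from the repo; the gradient threshold and function names are illustrative assumptions): mask out depth pixels with large Sobel gradients, then compute the depth loss only over the remaining valid pixels.

```python
# Minimal sketch, assuming depth maps are stored as 2D float arrays
# where 0 already denotes "no measurement" (as in the released data).
import numpy as np
from scipy import ndimage


def filter_depth_edges(depth: np.ndarray, grad_thresh: float = 10.0) -> np.ndarray:
    """Zero out depth near object edges, where LiDAR/ToF depth is unreliable.

    `grad_thresh` is an illustrative value, not the one used in the repo.
    """
    gx = ndimage.sobel(depth, axis=1)
    gy = ndimage.sobel(depth, axis=0)
    grad_mag = np.hypot(gx, gy)
    filtered = depth.copy()
    filtered[grad_mag > grad_thresh] = 0.0  # 0 marks invalid, matching the dataset convention
    return filtered


def masked_depth_loss(pred_depth: np.ndarray, gt_depth: np.ndarray) -> float:
    """L1 depth loss computed only where the ground-truth depth is valid (non-zero)."""
    valid = gt_depth > 0
    if not np.any(valid):
        return 0.0
    return float(np.abs(pred_depth[valid] - gt_depth[valid]).mean())
```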
Closing it! Let me know if you have any questions.
Hi,
I am visualizing the depth maps of the iPhone dataset, and I found that a lot of pixels on the edges of objects don't have any depth values. Would you mind explaining why the depth obtained from LiDAR is like that? Thanks!