HKUST-3DV / DIM-SLAM

This is the official repo for the ICLR 2023 paper "DENSE RGB SLAM WITH NEURAL IMPLICIT MAPS".

Depth smooth loss #4

Closed zjulabwjt closed 1 year ago

zjulabwjt commented 1 year ago

I am confused about the depth smooth loss. I found that the current code uses a fake depth loss: `fake_depth_loss = torch.nn.functional.smooth_l1_loss(depth, torch.ones_like(depth) * 1.5, beta=0.1, reduction="sum")`. What does this term mean in the code? Also, I can't find the smooth loss in the paper.

poptree commented 1 year ago

Hi, I will add it to the final version if needed. During reimplementation, I found that the smooth term only has a minor effect, so I didn't include it in this version.
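For reference, a first-order, edge-aware depth smoothness term usually looks roughly like the sketch below. This is a generic formulation, not the exact term from the reimplementation; names such as `depth_patch` and `rgb_patch` are placeholders.

```python
import torch

def depth_smooth_loss(depth_patch, rgb_patch):
    """Generic edge-aware depth smoothness sketch.

    depth_patch: (H, W) rendered depth
    rgb_patch:   (H, W, 3) corresponding colors, used to down-weight edges
    """
    # First-order depth gradients along x and y
    d_dx = torch.abs(depth_patch[:, 1:] - depth_patch[:, :-1])
    d_dy = torch.abs(depth_patch[1:, :] - depth_patch[:-1, :])

    # Image gradients used as edge-aware weights
    # (smoothness is relaxed where the color changes sharply)
    i_dx = torch.mean(torch.abs(rgb_patch[:, 1:] - rgb_patch[:, :-1]), dim=-1)
    i_dy = torch.mean(torch.abs(rgb_patch[1:, :] - rgb_patch[:-1, :]), dim=-1)

    w_x = torch.exp(-i_dx)
    w_y = torch.exp(-i_dy)

    return (d_dx * w_x).mean() + (d_dy * w_y).mean()
```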

Best, Heng

poptree commented 1 year ago

The fake depth loss provides an initialization of the depth to prevent the optimization from getting trapped in a singular solution or a local minimum. It is only used in the first 150 iterations of the initialization stage. If you have another initialization method, feel free to replace the fake depth loss with it and see how well it works.
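Schematically, the gating looks something like the sketch below. This is illustrative only, not the actual training loop: `mapping_loss` and the photometric placeholder are assumptions, and only the 1.5 prior, `beta=0.1`, `reduction="sum"`, and the 150-iteration window come from the code and the comment above.

```python
import torch
import torch.nn.functional as F

FAKE_DEPTH_PRIOR = 1.5  # constant depth prior from the snippet in the question
INIT_ITERS = 150        # fake depth loss is only applied during these iterations

def mapping_loss(depth, color, gt_color, it):
    """Sketch of how the fake depth loss gates into the total loss.

    depth:    (N,) depths rendered from the implicit map
    color:    (N, 3) rendered colors
    gt_color: (N, 3) observed colors
    it:       current iteration of the initialization stage
    """
    # Placeholder for the photometric / warping terms of the real objective
    loss = F.l1_loss(color, gt_color)

    if it < INIT_ITERS:
        # Pull rendered depth toward a constant prior so the optimization
        # does not collapse into a singular solution or a bad local minimum.
        fake_depth_loss = F.smooth_l1_loss(
            depth, torch.ones_like(depth) * FAKE_DEPTH_PRIOR,
            beta=0.1, reduction="sum")
        loss = loss + fake_depth_loss

    return loss
```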

poptree commented 1 year ago

Let me know if you still have any questions.

Best, Heng

zjulabwjt commented 1 year ago

> Let me know if you still have any questions.
>
> Best, Heng

Thanks for your reply! And when will you open-source the full code?

poptree commented 1 year ago

Maybe ~June, depending on my workload.

Best