pablovela5620 / mini-dust3r


About dust3r depthmap #4

Open puyiwen opened 1 month ago

puyiwen commented 1 month ago

Hi, I want to convert the depth map to metric depth and evaluate on the NYU dataset. How can I do that? Can you help me? Thank you very much!

pablovela5620 commented 1 month ago

Unfortunately, dust3r does not provide absolute metric depth; its output is only relative, up to an unknown scale and shift, based on this comment: https://github.com/naver/dust3r/issues/44#issuecomment-1999227727. You would probably need to do some depth map alignment against the NYU ground truth. You can take a look at a script like this: https://github.com/maturk/dn-splatter/blob/main/dn_splatter/scripts/align_depth.py
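The usual alignment is a per-image least-squares fit of a scale and shift so the predicted relative depth matches the metric ground truth on valid pixels. A minimal sketch (the function name and signature are mine, not from the linked script):

```python
import numpy as np

def align_scale_shift(pred: np.ndarray, gt: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Fit scale s and shift t so that s * pred + t ~= gt in the
    least-squares sense, using only pixels where mask is True
    (i.e. where the ground-truth depth is valid)."""
    p = pred[mask].ravel()
    g = gt[mask].ravel()
    # Solve [p 1] @ [s, t]^T = g for (s, t).
    A = np.stack([p, np.ones_like(p)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, g, rcond=None)
    return s * pred + t
```

After this alignment you can compute the standard NYU metrics (AbsRel, RMSE, delta < 1.25, etc.) between the aligned prediction and ground truth. Note that because scale/shift are fitted per image against the ground truth, this only measures relative-depth quality, not true metric accuracy.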

puyiwen commented 1 month ago

@pablovela5620 Thank you for your reply! So the alignment code converts relative depth to metric depth, similar to how MiDaS predictions are evaluated. I have another question: what are the practical applications of the purely relative 3D reconstruction and relative camera poses that dust3r produces? Do the outputs have to be absolute (metric) to be useful in real application scenarios? Do you have any insights? Thank you again!

pablovela5620 commented 1 month ago

Dust3r is similar to other structure-from-motion pipelines like COLMAP, which also do not provide metric estimates, only reconstructions up to an unknown scale. So you could use it for things like novel view synthesis via NeRF/gsplat, or simply to obtain a 3D reconstruction.

pablovela5620 commented 1 month ago

Another option you could look into for aligning relative depth maps + camera poses to metric scale is what TRAM does for estimating metric scale: https://github.com/yufu-wang/tram/blob/main/lib/camera/est_scale.py https://yufu-wang.github.io/tram4d/
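The key point with this kind of approach is that a single global scale factor is estimated (e.g. robustly, from the ratio of a metric reference to the relative depth) and then applied consistently to both the depth maps and the camera translations, while rotations stay untouched. A minimal sketch of that idea, assuming 4x4 camera-to-world matrices (function names are mine, not TRAM's API):

```python
import numpy as np

def estimate_global_scale(rel_depth: np.ndarray,
                          metric_ref: np.ndarray,
                          mask: np.ndarray) -> float:
    """Robust single scale factor: median ratio of a metric depth
    reference to the relative depth over valid pixels."""
    ratios = metric_ref[mask] / np.clip(rel_depth[mask], 1e-6, None)
    return float(np.median(ratios))

def apply_scale(depths: list[np.ndarray],
                cam_to_worlds: list[np.ndarray],
                s: float):
    """Scale all depth maps and camera translations by s.
    Rotations are scale-invariant and remain unchanged."""
    scaled_depths = [d * s for d in depths]
    scaled_poses = []
    for T in cam_to_worlds:
        T = T.copy()
        T[:3, 3] *= s  # translation column of the 4x4 pose
        scaled_poses.append(T)
    return scaled_depths, scaled_poses
```

TRAM derives its metric reference from humans in the scene (known body scale); in principle any metric cue (a known object size, a sensor depth map, etc.) could play the same role.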