Parskatt / DKM

[CVPR 2023] DKM: Dense Kernelized Feature Matching for Geometry Estimation
https://parskatt.github.io/DKM/

what does low_res_certainty do in dkm.py? #31

Closed Moreland-cas closed 1 year ago

Moreland-cas commented 1 year ago

Great work! I wonder what the effect of this is in `dkm/models/dkm.py`:

```python
low_res_certainty = factor * low_res_certainty * (low_res_certainty < cert_clamp)
...
dense_certainty = dense_certainty - low_res_certainty
```

Also, shouldn't this:

```python
query_coords = torch.meshgrid(
    (
        torch.linspace(-1 + 1 / hs, 1 - 1 / hs, hs, device=device),
        torch.linspace(-1 + 1 / ws, 1 - 1 / ws, ws, device=device),
    )
)
```

be this?

```python
query_coords = torch.meshgrid(
    (
        torch.linspace(-1 + 1 / (2 * hs), 1 - 1 / (2 * hs), hs, device=device),
        torch.linspace(-1 + 1 / (2 * ws), 1 - 1 / (2 * ws), ws, device=device),
    )
)
```

Thanks!

Parskatt commented 1 year ago

The first is a heuristic: we find that scale 16 is overly uncertain, so we optionally just remove some of that uncertainty.

Second, if we take an extreme case like hs=2, we get points at -0.5 and 0.5. I think this is in line with `align_corners=False`.
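For concreteness, a small torch-free sketch (plain Python standing in for `torch.linspace`, so nothing here is DKM's actual code) showing that the endpoints `-1 + 1/hs` and `1 - 1/hs` already land on the pixel centers that the `align_corners=False` convention assumes:

```python
def pixel_centers(hs):
    # align_corners=False convention: hs cells span [-1, 1], each of
    # width 2/hs, with centers at -1 + (i + 0.5) * (2/hs).
    return [-1 + (2 * i + 1) / hs for i in range(hs)]

def dkm_linspace(hs):
    # Plain-Python equivalent of torch.linspace(-1 + 1/hs, 1 - 1/hs, hs).
    lo, hi = -1 + 1 / hs, 1 - 1 / hs
    if hs == 1:
        return [lo]
    step = (hi - lo) / (hs - 1)
    return [lo + k * step for k in range(hs)]

print(pixel_centers(2))   # [-0.5, 0.5]
print(dkm_linspace(2))    # [-0.5, 0.5]
```

So for hs=2 both give exactly -0.5 and 0.5; the proposed `1 / (2 * hs)` endpoints would instead give -0.75 and 0.75, which are not the cell centers.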

Moreland-cas commented 1 year ago

> The first is a heuristic: we find that scale 16 is overly uncertain, so we optionally just remove some of that uncertainty.
>
> Second, if we take an extreme case like hs=2, we get points at -0.5 and 0.5. I think this is in line with `align_corners=False`.

Thanks for the quick reply! I'm good with the second one now. Can you explain the first one a bit more, please? What do you mean by "overly uncertain", and how does the subtraction make it better?

Parskatt commented 1 year ago

Basically, we train the confidence in our model to match the MVS dataset (MegaDepth). MegaDepth quite often fails to provide accurate depth, so the model learns this bias. In post-processing we "soften" the uncertain regions to compensate for this somewhat. It's quite a heuristic approach and could be improved, I think :D
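A minimal sketch of how I read the softening step (plain Python over flat lists, not DKM's tensor code; the names `factor` and `cert_clamp` come from the snippet quoted in the issue, and the default values here are illustrative assumptions):

```python
def soften_certainty(dense_certainty, low_res_certainty, factor=0.5, cert_clamp=0.0):
    """Subtract the scaled, clamped low-res certainty from the dense one.

    Certainties are logits: negative means uncertain. Where the scale-16
    stage is uncertain (logit below cert_clamp), subtracting its negative
    logit *raises* the dense logit, softening the learned over-uncertainty.
    """
    out = []
    for d, c in zip(dense_certainty, low_res_certainty):
        # Mirrors factor * c * (c < cert_clamp) from the issue's snippet.
        penalty = factor * c if c < cert_clamp else 0.0
        out.append(d - penalty)
    return out

# Uncertain coarse region (logit -2.0): dense logit boosted 1.0 -> 2.0.
# Confident coarse region (logit 3.0): left untouched.
print(soften_certainty([1.0, 1.0], [-2.0, 3.0]))  # [2.0, 1.0]
```

The key point is that the low-res term is negative exactly where the coarse stage is (over-)uncertain, so the subtraction adds confidence back in those regions only.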