Here is a model trained on Matterport3D.
validation houses: yqstnuAEVhm, Z6MFQCViBuw, ZMojNkEp431, zsNo4HB9uLZ, XcA2TqTSSAj
test houses: VzqfbhrpDEA, Vvot9Ly1tCj, YFuZgdQ5vWj, YVUC4YcDtcY, YmJkqBEsHnH
training houses: all remaining houses
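The split above can be expressed in code. This is a hypothetical helper (not part of the released repository) that derives the "all remaining houses" training set from the listed validation and test houses:

```python
# Matterport3D house splits as listed above.
VAL_HOUSES = ["yqstnuAEVhm", "Z6MFQCViBuw", "ZMojNkEp431", "zsNo4HB9uLZ", "XcA2TqTSSAj"]
TEST_HOUSES = ["VzqfbhrpDEA", "Vvot9Ly1tCj", "YFuZgdQ5vWj", "YVUC4YcDtcY", "YmJkqBEsHnH"]

def training_houses(all_houses):
    """Training set = all houses not held out for validation or test."""
    held_out = set(VAL_HOUSES) | set(TEST_HOUSES)
    return [h for h in all_houses if h not in held_out]
```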
For best performance, train the depth priors for the scenario in which you plan to use them (e.g. matching sparse depth density and accuracy). Details can be found here
The network inputs, RGB and sparse depth, are normalized in the same way as for ScanNet:
https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L683
https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L670
The network outputs, depth and uncertainty, are converted back to meters like:
https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L686
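As a rough illustration of this normalize/denormalize round trip, here is a minimal sketch. The constants below (per-channel RGB mean/std and a maximum depth) are placeholders, not the actual values — those are defined at the linked lines in run_nerf.py:

```python
import numpy as np

# Placeholder normalization constants for illustration only; the real values
# live in run_nerf.py of dense_depth_priors_nerf (see the linked lines).
RGB_MEAN = np.array([0.485, 0.456, 0.406])
RGB_STD = np.array([0.229, 0.224, 0.225])
DEPTH_MAX = 10.0  # assumed maximum depth in meters

def normalize_input(rgb, sparse_depth):
    """Normalize RGB per channel and scale sparse depth into [0, 1]."""
    rgb_n = (rgb - RGB_MEAN) / RGB_STD
    depth_n = sparse_depth / DEPTH_MAX
    return rgb_n, depth_n

def denormalize_output(depth_pred, uncertainty_pred):
    """Convert the network's depth and uncertainty back to meters."""
    return depth_pred * DEPTH_MAX, uncertainty_pred * DEPTH_MAX
```

The key point is that whatever scaling is applied to the sparse depth input must be inverted on the predicted depth and uncertainty, so that the priors fed to the NeRF are in meters.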