barbararoessle / dense_depth_priors_nerf

Dense Depth Priors for Neural Radiance Fields from Sparse Input Views
MIT License

Could you provide the pretrained depth net on Matterport3D? #7

Closed cwchenwang closed 1 year ago

barbararoessle commented 2 years ago

Here is a model trained on Matterport3D.

- validation houses: yqstnuAEVhm, Z6MFQCViBuw, ZMojNkEp431, zsNo4HB9uLZ, XcA2TqTSSAj
- test houses: VzqfbhrpDEA, Vvot9Ly1tCj, YFuZgdQ5vWj, YVUC4YcDtcY, YmJkqBEsHnH
- training houses: all remaining houses
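For scripting comparisons, the split above can be captured as a small Python dict. The house IDs are copied from the list; the training set ("all remaining houses") has to be derived from the full Matterport3D scan list and is not enumerated here:

```python
# Matterport3D split from the comment above; the training houses are
# "all remaining houses" and must be derived from the full scan list.
MATTERPORT3D_SPLIT = {
    "val": ["yqstnuAEVhm", "Z6MFQCViBuw", "ZMojNkEp431", "zsNo4HB9uLZ", "XcA2TqTSSAj"],
    "test": ["VzqfbhrpDEA", "Vvot9Ly1tCj", "YFuZgdQ5vWj", "YVUC4YcDtcY", "YmJkqBEsHnH"],
}
```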

The network inputs, RGB and sparse depth, are normalized in the same way as for ScanNet:

- https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L683
- https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L670

And the network outputs, depth and uncertainty, are converted back to meters here:

- https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L686
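A rough sketch of the normalization round trip described above, in plain Python. The statistic values and function names are placeholders, not the repository's actual ones; the real normalization lives at the linked lines in run_nerf.py:

```python
# Placeholder depth statistics; the actual values come from the repo / checkpoint.
DEPTH_MEAN = 2.0  # meters (hypothetical)
DEPTH_STD = 1.0   # meters (hypothetical)

def normalize_input(rgb, sparse_depth):
    """Scale RGB to [0, 1] and standardize sparse depth (sketch, not the repo code)."""
    rgb_norm = rgb / 255.0                               # assumption: 8-bit RGB input
    depth_norm = (sparse_depth - DEPTH_MEAN) / DEPTH_STD
    return rgb_norm, depth_norm

def denormalize_output(pred_depth, pred_uncertainty):
    """Convert predicted depth and uncertainty back to meters (sketch)."""
    depth_m = pred_depth * DEPTH_STD + DEPTH_MEAN
    uncertainty_m = pred_uncertainty * DEPTH_STD         # std scales, no mean shift
    return depth_m, uncertainty_m
```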

For best performance, the depth prior training should consider the scenario in which you plan to use the depth priors (e.g. the density and accuracy of the sparse depth). Details can be found here

cwchenwang commented 2 years ago

@barbararoessle You should provide pretrained models and the train/test split on Matterport3D so that future works can compare.