barbararoessle / dense_depth_priors_nerf

Dense Depth Priors for Neural Radiance Fields from Sparse Input Views

How to use Dense Depth Priors on other datasets, such as Matterport3D? #1

Closed: EchoTHChen closed this issue 2 years ago

barbararoessle commented 2 years ago

Training Depth Priors Network

We used RGB-D input to train the depth completion network. For a different dataset, the training data needs to be loaded similarly to scannet_dataset.py. The network is trained on sparse depth that is sampled from the dense depth map and perturbed by Gaussian noise. For the sampling and perturbation, it makes sense to consider the characteristics of the sparse depth data on which you plan to optimize NeRF:

Sampling:

Depth perturbation:

Of course, if you have a large set of SfM reconstructions with corresponding dense depth maps (e.g., the MegaDepth dataset), you could also train on SfM sparse depth directly instead of sampling and perturbing RGB-D data.
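
For illustration, below is a minimal numpy sketch of this sampling-and-perturbation idea; the function name, the number of samples, and the noise standard deviation are placeholders chosen here, not the values used in scannet_dataset.py:

```python
import numpy as np

def sample_sparse_depth(dense_depth, n_samples=500, noise_std=0.01, rng=None):
    """Sample a sparse depth map from a dense depth map and perturb it.

    dense_depth : (H, W) array in meters, 0 where the sensor has no reading.
    n_samples   : number of samples; ideally matches the typical number of
                  SfM keypoints per image in the data you optimize NeRF on.
    noise_std   : std of the Gaussian perturbation in meters (placeholder).
    """
    rng = np.random.default_rng() if rng is None else rng

    # Only sample pixels that have a valid depth reading.
    valid_v, valid_u = np.nonzero(dense_depth > 0.0)
    idx = rng.choice(len(valid_v), size=min(n_samples, len(valid_v)), replace=False)
    v, u = valid_v[idx], valid_u[idx]

    # Perturb the sampled depth values with Gaussian noise to mimic SfM errors.
    noisy = dense_depth[v, u] + rng.normal(0.0, noise_std, size=len(idx))

    # Scatter the perturbed samples back into an otherwise empty depth map.
    sparse = np.zeros_like(dense_depth)
    sparse[v, u] = np.clip(noisy, a_min=0.0, a_max=None)
    return sparse
```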

Optimizing NeRF with Dense Depth Priors

You can provide scenes for NeRF in this format:

scene
│   transforms_train.json
│   transforms_test.json
└───train
│   └───rgb
│   │   │   0.jpg
│   │   │   1.jpg
│   │   │   ...
│   │
│   └───depth
│   │   │   0.png
│   │   │   1.png
│   │   │   ...
│   │
│   └───target_depth
│       │   0.png
│       │   1.png
│       │   ...
│
└───test
    │   # same structure as in "train"

The subdirectory rgb contains the RGB images and depth contains the sparse depth maps (e.g., rendered from an SfM reconstruction). target_depth contains ground-truth depth (e.g., sensor depth maps), which is not needed for optimizing NeRF and is only used for evaluation purposes, such as depth metrics. If you do not have ground-truth depth for the scene, it may be necessary to comment out the respective lines.
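
As a rough illustration, a directory in the layout above could be traversed as follows; the helper name and the use of imageio are assumptions made here, not the repository's data loader:

```python
import os
import imageio.v2 as imageio

def load_split(scene_dir, split="train"):
    """Iterate over the frames of one split in the layout sketched above."""
    split_dir = os.path.join(scene_dir, split)
    rgb_dir = os.path.join(split_dir, "rgb")
    frames = []
    for name in sorted(os.listdir(rgb_dir), key=lambda n: int(os.path.splitext(n)[0])):
        frame_id = os.path.splitext(name)[0]
        rgb = imageio.imread(os.path.join(rgb_dir, name))
        sparse_depth = imageio.imread(os.path.join(split_dir, "depth", frame_id + ".png"))
        # target_depth is optional: only needed for evaluation, e.g. depth metrics.
        target_path = os.path.join(split_dir, "target_depth", frame_id + ".png")
        target_depth = imageio.imread(target_path) if os.path.exists(target_path) else None
        frames.append({"rgb": rgb, "depth": sparse_depth, "target_depth": target_depth})
    return frames
```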

transforms_train.json and transforms_test.json contain the following information:

near, far, and depth_scaling_factor are scene-specific parameters that need to be the same for the train and test sets.
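
As a hedged sketch of how such a file could be assembled: near, far and depth_scaling_factor are the scene parameters mentioned above, while the per-frame entries and key names below follow the common NeRF-style transforms convention and are assumptions here, as are all numeric values:

```python
import json
import numpy as np

# Scene-specific parameters; must be identical in transforms_train.json
# and transforms_test.json.
scene_params = {
    "near": 0.5,                     # placeholder near plane in meters
    "far": 6.0,                      # placeholder far plane in meters
    "depth_scaling_factor": 1000.0,  # placeholder, e.g. if the 16-bit depth PNGs
                                     # store millimeters that are divided by 1000
                                     # to obtain meters (an assumption here)
}

# One entry per image; "file_path" and "transform_matrix" (camera-to-world pose)
# follow the usual NeRF transforms convention, which is assumed here, not taken
# from the repository.
frames = [
    {
        "file_path": "train/rgb/0.jpg",
        "transform_matrix": np.eye(4).tolist(),
    },
    # ... one entry per image
]

with open("transforms_train.json", "w") as f:
    json.dump({**scene_params, "frames": frames}, f, indent=2)
```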

EchoTHChen commented 2 years ago

Thanks