nerfstudio-project / nerfstudio

A collaboration friendly studio for NeRFs
https://docs.nerf.studio
Apache License 2.0

How to train depth-nerfacto using monocular depth? #2858

Open YuiNsky opened 5 months ago

YuiNsky commented 5 months ago

Thanks for the great work on this project. I'd like to know how to use monocular estimated depth to supervise training, since the COLMAP depth is too sparse.

peasant98 commented 5 months ago

I believe depth-nerfacto is hooked up to DepthDataset, which generates ZoeDepth (state-of-the-art monocular depth estimation) depth images if you don't provide any depth data. That said, I would recommend an approach that I think will perform better:

  1. Run COLMAP and get the sparse points.
  2. Run the monocular depth model on the images.
  3. For each image, compute a scale factor and offset with respect to the COLMAP points seen in that image.
  4. Multiply the depth image from the model by the scale factor and add the offset.
  5. Train the NeRF!
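The per-image alignment in steps 3 and 4 can be sketched as a least-squares fit. This is a minimal sketch with a hypothetical helper, not a nerfstudio API:

```python
import numpy as np

def align_mono_depth(mono_depth, point_uv, point_z):
    """Fit depth ~= s * mono + t against COLMAP points, then apply to the image.

    mono_depth: (H, W) monocular depth prediction for one image.
    point_uv:   (N, 2) integer pixel coordinates (u, v) of COLMAP points.
    point_z:    (N,)   depths of those points in this camera.
    """
    # Sample the monocular depth at the pixels where COLMAP has points.
    mono_at_pts = mono_depth[point_uv[:, 1], point_uv[:, 0]]
    # Least-squares solve for scale s and offset t.
    A = np.stack([mono_at_pts, np.ones_like(mono_at_pts)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, point_z, rcond=None)
    return s * mono_depth + t
```

In practice a robust fit (e.g. RANSAC or a median-based estimator) is more forgiving of COLMAP outliers than plain least squares.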
aeskandari68 commented 5 months ago

> I believe depth nerfacto is hooked up to DepthDataset, where if you don't provide any depth data, then ZOE (state of the art monocular depth estimation) depth images will be generated. To be honest though, I would recommend a way that I think will perform better:
>
> 1. Run colmap depth and get the sparse points
> 2. Run the monocular depth model on the images.
> 3. For each image, compute a scale factor and offset with respect to the colmap points seen in each image.
> 4. Multiply the depth image from the model by the scale factor and add the offset
> 5. Train the nerf!

As you mentioned, when no depth information is provided, depth-nerfacto uses ZoeDepth to estimate the depth. However, when I run it, I encounter the following error. Any feedback?

/nerfstudio/models/depth_nerfacto.py", line 86, in get_metrics_dict
    raise ValueError(
ValueError: Forcing pseudodepth loss, but depth loss type (DepthLossType.DS_NERF) must be one of (<DepthLossType.SPARSENERF_RANKING: 3>,)

https://github.com/nerfstudio-project/nerfstudio/blob/242c23f0f067064c16c49376c02271cd1cd2303b/nerfstudio/models/depth_nerfacto.py#L79-L88
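For reference, the check behind this error can be paraphrased like this (a sketch of the logic at the linked lines, not the exact nerfstudio source):

```python
from enum import Enum

class DepthLossType(Enum):
    DS_NERF = 1
    URF = 2
    SPARSENERF_RANKING = 3

# When no real depth is supplied, the model falls back to generated
# (pseudo)depth, and only ranking-style losses are considered valid.
LOSSES_VALID_FOR_PSEUDODEPTH = (DepthLossType.SPARSENERF_RANKING,)

def check_depth_loss(using_pseudodepth: bool, loss_type: DepthLossType) -> None:
    """Raise if a pseudodepth-incompatible loss is configured."""
    if using_pseudodepth and loss_type not in LOSSES_VALID_FOR_PSEUDODEPTH:
        raise ValueError(
            f"Forcing pseudodepth loss, but depth loss type ({loss_type}) "
            f"must be one of {LOSSES_VALID_FOR_PSEUDODEPTH}"
        )
```

So with generated ZoeDepth supervision, the default DS_NERF loss trips this check, which is why switching the loss type resolves it.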

MartinEthier commented 5 months ago

> > I believe depth nerfacto is hooked up to DepthDataset, where if you don't provide any depth data, then ZOE (state of the art monocular depth estimation) depth images will be generated. To be honest though, I would recommend a way that I think will perform better:
> >
> > 1. Run colmap depth and get the sparse points
> > 2. Run the monocular depth model on the images.
> > 3. For each image, compute a scale factor and offset with respect to the colmap points seen in each image.
> > 4. Multiply the depth image from the model by the scale factor and add the offset
> > 5. Train the nerf!
>
> As you mentioned, without having depth information, depth-nerfacto uses ZOE to estimate the depth. However, when I run it, I encounter the following error. Any feedback?
>
>     /nerfstudio/models/depth_nerfacto.py", line 86, in get_metrics_dict
>         raise ValueError(
>     ValueError: Forcing pseudodepth loss, but depth loss type (DepthLossType.DS_NERF) must be one of (<DepthLossType.SPARSENERF_RANKING: 3>,)
>
> https://github.com/nerfstudio-project/nerfstudio/blob/242c23f0f067064c16c49376c02271cd1cd2303b/nerfstudio/models/depth_nerfacto.py#L79-L88

Try setting `--pipeline.model.depth-loss-type SPARSENERF_RANKING`.
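For example, assuming the usual `ns-train` entry point (adjust the data path for your setup):

```shell
ns-train depth-nerfacto \
  --data /path/to/your/data \
  --pipeline.model.depth-loss-type SPARSENERF_RANKING
```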