Totoro97 / f2-nerf

Fast neural radiance field training with free camera trajectories
https://totoro97.github.io/projects/f2-nerf/
Apache License 2.0
921 stars · 68 forks

Export point cloud in original space #80

Closed Bin-ze closed 1 year ago

Bin-ze commented 1 year ago

I am trying to use f2-nerf to export a point cloud. The method is simple: map each predicted depth value into 3D space along its ray:

Tensor point = rays_o.to(torch::kCPU) + rays_d.to(torch::kCPU) * pred_depth;

However, the exported point cloud has some distortion, which is expected since the samples are trained in warped space. How do I map the point cloud back to Euclidean space?
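For reference, the per-ray mapping itself is just a scale-and-add. A minimal plain-C++ sketch of the same operation (the `Vec3` type and `depths_to_points` helper are hypothetical, standing in for the libtorch tensors; the caveat about warped-space depths is the whole issue here):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Map a predicted depth along each ray to a 3D point:
//   point = ray_origin + ray_direction * depth
// Note: this yields Euclidean points only if `depth` is measured along the
// ray in the original (unwarped) space; depths produced inside f2-nerf's
// warped space would first need to go back through the inverse warp.
std::vector<Vec3> depths_to_points(const std::vector<Vec3>& rays_o,
                                   const std::vector<Vec3>& rays_d,
                                   const std::vector<float>& depths) {
  std::vector<Vec3> points(rays_o.size());
  for (std::size_t i = 0; i < rays_o.size(); ++i) {
    points[i] = { rays_o[i].x + rays_d[i].x * depths[i],
                  rays_o[i].y + rays_d[i].y * depths[i],
                  rays_o[i].z + rays_d[i].z * depths[i] };
  }
  return points;
}
```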

Bin-ze commented 1 year ago

I have another question: what is the range of the point-cloud prediction result? Is it in a [-1, 1] box?

meriemjabri commented 1 year ago

So did you manage to predict depth? Or do you mean `pred_disps`?

meriemjabri commented 1 year ago

And could you please elaborate on what other changes we need to make to get the point cloud, and which file we need to check?

Bin-ze commented 1 year ago

Yes, f2-nerf can predict depth, so it can export a point cloud, but the depth is in warped space. You can check this for reference: https://github.com/nerfstudio-project/nerfstudio/blob/main/nerfstudio/exporter/exporter_utils.py Another point is that you need to consider how to filter out far-away points, which affects the quality of the point cloud.
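One common heuristic for the far-point filtering mentioned above is a quantile cutoff on depth: drop the farthest few percent of points, which usually removes background/sky points that smear the cloud. A plain-C++ sketch (the quantile value and the `filter_far_points` helper are assumptions, not part of f2-nerf or nerfstudio):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Return the indices of points whose depth is at or below the chosen
// depth quantile (e.g. 0.95 keeps roughly the nearest 95% of points).
// This is a simple outlier heuristic; the right cutoff is scene-dependent.
std::vector<std::size_t> filter_far_points(const std::vector<float>& depths,
                                           float quantile) {
  std::vector<float> sorted(depths);
  std::sort(sorted.begin(), sorted.end());
  std::size_t idx =
      static_cast<std::size_t>(quantile * static_cast<float>(sorted.size() - 1));
  float cutoff = sorted[idx];
  std::vector<std::size_t> keep;
  for (std::size_t i = 0; i < depths.size(); ++i) {
    if (depths[i] <= cutoff) keep.push_back(i);
  }
  return keep;
}
```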

Bin-ze commented 1 year ago

@Totoro97, could you tell me how to export the point cloud?

maliksyria commented 1 year ago

@Bin-ze I'm having exactly the same issue: the generated disparity has distortion problems. Have you managed to solve it?

zZH1222hui commented 7 months ago

> @Totoro97, could you tell me how to export the point cloud?

Hi, have you found a method for exporting the point cloud? How do you do it, specifically?