NVlabs / instant-ngp

Instant neural graphics primitives: lightning fast NeRF and more
https://nvlabs.github.io/instant-ngp

Wrong dimensions of mesh file #1345

Open ddkats opened 1 year ago

ddkats commented 1 year ago

I'm using ingp to extract a colored mesh from a real scene, but when I import the .ply file into MeshLab, I notice a discrepancy between the ground-truth measurements of some objects in the scene and the measurements produced by ingp. Specifically, the measurements appear to be roughly three times larger than the ground truth. I wonder if this has something to do with the default scale of 0.33 or the aabb_scale factor. What should I change in the transforms.json file to keep the original dimensions of the generated 3D mesh?

Tom94 commented 1 year ago

scale: 1.0 should do the trick. That's the parameter that's 0.33 by default.
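For reference, scale is a top-level field of transforms.json; a minimal excerpt might look like the following (the other values shown are illustrative placeholders from a typical colmap2nerf.py output, not recommendations):

{
  "camera_angle_x": 0.69,
  "aabb_scale": 16,
  "scale": 1.0,
  "frames": [ ... ]
}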

ddkats commented 1 year ago

> scale: 1.0 should do the trick. That's the parameter that's 0.33 by default.

Thanks for your reply, and congrats on the excellent work. That's what I thought, but when I set scale: 1.0, the result seems to be worse. In fact, this happens every time I set the scale above 0.33 (the default value). Do you have any other suggestions?

Tom94 commented 1 year ago

In that case, tweaking scale is probably not the answer; I'd instead recommend downscaling the vertices of the output mesh.

Instant NGP makes use of several overlaid grids as part of its data structure, so the scale relative to these grids can impact reconstruction quality -- in your case negatively. It might even pay off to see if your quality keeps improving if you keep reducing scale.
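A minimal sketch of that post-processing step (the trimesh package, the file names, and the 1/3 factor are illustrative here, not part of instant-ngp):

import trimesh

# Mesh exported by instant-ngp (path is a placeholder).
mesh = trimesh.load("base.ply")
# Undo the roughly 3x blow-up reported above; the exact factor may differ per dataset.
mesh.apply_scale(1.0 / 3.0)
mesh.export("base_rescaled.ply")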

ddkats commented 1 year ago

> In that case, tweaking scale is probably not the answer; I'd instead recommend downscaling the vertices of the output mesh.
>
> Instant NGP makes use of several overlaid grids as part of its data structure, so the scale relative to these grids can impact reconstruction quality -- in your case negatively. It might even pay off to see if your quality keeps improving if you keep reducing scale.

Thanks for the input. Indeed, downscaling the vertices of the output mesh is what I tried, and it seems to work!

Saddy21 commented 1 year ago

> downscale the vertices of the output mesh.

Hi, can you please explain how you managed to solve this problem, as I am having exactly the same issue?

ddkats commented 1 year ago

> > downscale the vertices of the output mesh.
>
> Hi, can you please explain how you managed to solve this problem, as I am having exactly the same issue?

Hi, I loaded the mesh file into MeshLab and scaled it by a factor of 0.33 (on each axis). Then I checked the dimensions of the mesh against the ground truth, and they appeared to be correct and extremely close to the original dimensions. However, when I tested it with another mesh from ingp, it did not work. So I am still facing a problem with the dimensions, and I am not sure what the magic formula is to determine the scaling factor each time.

Saddy21 commented 1 year ago

Exactly. It seems to upscale by a different factor every time (when using a different dataset), even for the same object.

ddkats commented 1 year ago

> Exactly. It seems to upscale by a different factor every time (when using a different dataset), even for the same object.

And the aabb_scale doesn't seem to have an impact on the dimensions of the mesh...

Saddy21 commented 1 year ago

> > Exactly. It seems to upscale by a different factor every time (when using a different dataset), even for the same object.
>
> And the aabb_scale doesn't seem to have an impact on the dimensions of the mesh...

Yes, it seems there is some issue in the pipeline itself. COLMAP works fine in photogrammetry tools, e.g. Meshroom or MicMac, where the dimensions are accurate, but with NeRF there seems to be some issue.

ddkats commented 1 year ago

Perhaps @Tom94 can enlighten us on this matter.

cubantonystark commented 1 year ago

In order to adjust the scale, I recommend you either set a scale ratio manually within MeshLab (see https://www.youtube.com/watch?v=6psAppbOOXM) or use PyMeshLab to automate the process. Mind you, you would need either a real measurement or an approximation of one to calculate the ratio. Hope this helps.
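A rough sketch of that ratio-based approach (the trimesh package, the file names, and both measurements are hypothetical placeholders):

import trimesh

real_length = 0.30   # ground-truth size of a reference object in the scene, e.g. in metres
mesh_length = 0.91   # the same edge measured on the instant-ngp mesh, e.g. in MeshLab

mesh = trimesh.load("scene.ply")
mesh.apply_scale(real_length / mesh_length)  # apply the ratio uniformly to all axes
mesh.export("scene_metric.ply")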

ddkats commented 1 year ago

> In order to adjust the scale, I recommend you either set a scale ratio manually within MeshLab (see https://www.youtube.com/watch?v=6psAppbOOXM) or use PyMeshLab to automate the process. Mind you, you would need either a real measurement or an approximation of one to calculate the ratio. Hope this helps.

I appreciate your help. However, it's not possible for me to use real measurements to calculate the scale ratio. The fact is that we don't know how the scale factor of the ingp mesh is calculated, because the mesh scale differs from dataset to dataset. @Tom94, can we have any update on this? Thanks

Saddy21 commented 1 year ago

What exactly is the scale ratio? If you run NeRF on the same object with two different datasets, it will give a different mesh, i.e. a different scale each time, as opposed to our assumption that the scale would remain the same. Sometimes it upscales by 2, sometimes by 2.5. That is the problem: we can't keep this scale ratio fixed. In the script it is set to 0.33, but that still doesn't help. Hope you understand. Thanks

Tom94 commented 1 year ago

This sounds to me like your transforms.json files have an inherently different scale across datasets (which carries through to the mesh). This is unavoidable if you use e.g. COLMAP to estimate camera parameters for independent datasets, as there's an inherent ambiguity between scene scale and camera distance. You need some form of external calibration to carry a consistent scale across multiple datasets -- my responses earlier in the thread assumed that to be the case and transforms.json to have physical units.

In any case, if you use COLMAP, you could try commenting out lines 379 to 385 from colmap2nerf.py.

avglen = 0.
for f in out["frames"]:
    avglen += np.linalg.norm(f["transform_matrix"][0:3,3])
avglen /= nframes
print("avg camera distance from origin", avglen)
for f in out["frames"]:
    f["transform_matrix"][0:3,3] *= 4.0 / avglen # scale to "nerf sized"

This will likely lessen the discrepancy (but not completely remove it) at the cost of slightly worse reconstruction quality.
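An alternative to commenting the block out is to keep the normalization (and its quality benefit), note the avglen value printed as "avg camera distance from origin", and undo the scaling on the exported mesh afterwards. A sketch, assuming the exported mesh is in the same units as transforms.json (i.e. the 0.33 scale discussed above has already been dealt with); note this only takes you back to COLMAP's units, which are themselves arbitrary:

import trimesh

avglen = 2.7                    # value printed by colmap2nerf.py for this dataset (placeholder)
mesh = trimesh.load("mesh.ply")
mesh.apply_scale(avglen / 4.0)  # invert the 4.0 / avglen "nerf sized" scaling above
mesh.export("mesh_colmap_units.ply")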

ddkats commented 1 year ago

Hi guys,

For the last few days I've been testing another pipeline that uses the sparse reconstruction output of the OpenMVG library to produce a 3D mesh. So I checked whether COLMAP's sparse reconstruction result can cause the same scaling issues and, as it turns out, COLMAP was the source of the problem; I think that's the case here as well.

@Saddy21 @cubantonystark @Tom94 Thanks for the contribution

asd351012 commented 8 months ago

Hello, thanks for the contribution. Can OpenMVG solve the issue of dimension errors?

amughrabi commented 6 months ago

AFAICT, I still have the same problem as this thread. I use COLMAP and an Intel RealSense T265 to make the poses up to scale. However, the generated meshes are unitless ([0...1]). I used the scale parameter and got the same result. The actual width, height, and depth are 5.6, 5.6, 5.6 cm. I also fed the same COLMAP output to OpenMVS, and OpenMVS scaled the meshes as expected. My assumption is that instant-ngp is rescaling the poses somewhere in the implementation (e.g., scale). How can we preserve the actual mesh dimensions?

[image attachment]
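If the exported mesh really is in instant-ngp's internal unit-cube frame, one thing to try is inverting the loader's mapping x_internal = scale * x_json + offset. This is only a sketch under that assumption: the 0.33 scale is the default mentioned above, while the [0.5, 0.5, 0.5] offset, the file names, and the neglect of any axis permutation the loader applies are assumptions on my part:

import numpy as np
import trimesh

scale = 0.33                          # instant-ngp's default scale (see above)
offset = np.array([0.5, 0.5, 0.5])    # assumed default offset
mesh = trimesh.load("mesh.ply")
mesh.vertices = (mesh.vertices - offset) / scale  # map back into transforms.json units
mesh.export("mesh_world_units.ply")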