NVlabs / latentfusion

LatentFusion: End-to-End Differentiable Reconstruction and Rendering for Unseen Object Pose Estimation
https://arxiv.org/pdf/1912.00416.pdf

Object meshes scale for MOPED dataset? #5

Closed georgegu1997 closed 4 years ago

georgegu1997 commented 4 years ago

Hi, I am using the script here (block "MOPED") to get the observations and object model. However, I cannot figure out the scale of the object meshes. Could you please help me with the following questions?

  1. What's the unit for the points in the model .obj file? It does not seem to be meters.
  2. What's the unit for the depths returned by `RealsenseDataset`? Is it meters?

Thanks!

keunhong commented 4 years ago

Hi,

1. There are three model files:
   a) `integrated_registered_processed.obj` is normalized by 1.2 times the maximum bounding length of the object, and centered so that its centroid is at the origin. This file isn't that useful.
   b) `integrated_registered.obj` is the same as above but not normalized.
   c) `pointcloud.ply` is an evenly sampled metric point cloud that you should use for evaluation. This is probably the one you want.

2. If you use the default `object_scale='auto'`, it normalizes the scale of the scene so that the maximum bounding length (the longest edge of the bounding cuboid) is 1.2. This was a lazy way to roughly get the diameter.

These normalization steps exist because our network assumes the object is centered at the origin and fits within the voxel grid spanning -1 to 1; normalizing this way makes it convenient to crop and otherwise manipulate.

The evaluations are all done in metric space by rescaling everything by the `object_scale`. `RealsenseDataset` has methods that normalize and denormalize the points and extrinsic matrices.
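For reference, the normalization described above can be sketched in a few lines of NumPy. This is not the repo's actual implementation (the function names here are hypothetical); it just illustrates centering a point cloud at its centroid, scaling so the longest bounding-box edge is 1.2, and undoing that to return to metric space:

```python
import numpy as np

def normalize_points(points, target_size=1.2):
    """Center a point cloud at its centroid and scale it so the longest
    edge of its axis-aligned bounding box equals target_size.
    Returns the normalized points plus (centroid, scale) to undo it."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    extent = centered.max(axis=0) - centered.min(axis=0)
    scale = target_size / extent.max()  # plays the role of object_scale
    return centered * scale, centroid, scale

def denormalize_points(points, centroid, scale):
    """Map normalized points back to metric space."""
    return points / scale + centroid
```

Applying `denormalize_points` with the stored centroid and scale recovers the original metric coordinates exactly, which is why evaluation can always be done in metric space regardless of how the scene was normalized for the network.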

Best, Keunhong

georgegu1997 commented 4 years ago

I figured that out. Thanks for your help!