Closed FabianSchuetze closed 1 year ago
Hey @FabianSchuetze ,
First of all, you need to take the full camera intrinsics into account to correctly project a depth map into a pointcloud, see e.g. here:
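To illustrate the point about the full intrinsics: a minimal back-projection sketch, assuming a standard pinhole model with BOP-style depth maps in millimeters (the function name and details here are my own, not from the linked reference). Note that the principal point `(cx, cy)` must be used in addition to the focal lengths.

```python
import numpy as np

def depth_to_pointcloud(depth, K):
    """Back-project a depth map (in mm, as in BOP) into a pointcloud.

    Uses the full pinhole intrinsics K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]],
    not just the focal length.
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    h, w = depth.shape
    # Pixel coordinate grids: u along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels
```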
Then there is the bop_toolkit, which helps you deal with BOP datasets; it can rule out mistakes made at the loading stage, e.g. matching the instances with the correct poses.
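For reference, the matching of instances to poses can also be sketched without the toolkit, assuming the standard BOP `scene_gt.json` layout (this is a minimal stand-in for the bop_toolkit loaders, not their actual implementation):

```python
import json
import numpy as np

def load_scene_gt(path, im_id):
    """Load ground-truth poses for one image of a BOP scene.

    scene_gt.json maps image ids (as strings) to a list of instances,
    each with obj_id, cam_R_m2c (row-major 3x3, flattened) and
    cam_t_m2c (in mm).
    """
    with open(path) as f:
        scene_gt = json.load(f)
    instances = []
    for inst in scene_gt[str(im_id)]:
        R = np.array(inst["cam_R_m2c"], dtype=np.float64).reshape(3, 3)
        t = np.array(inst["cam_t_m2c"], dtype=np.float64).reshape(3, 1)
        instances.append((inst["obj_id"], R, t))
    return instances
```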
Before I check the PBR data, please project the meshes into the real validation scenes and see if you have the same issues.
Hi @MartinSmeyer - thank you so much for your kind and detailed reply. I will do as you suggest and report back.
Thanks again for your instructions. Two things to note:
Anyway, I'm closing the issue - thanks for your help again.
Ah I see, yes in BOP the model origins are aligned with the center of their 3D bounding boxes. In principle you could compute the offset, but I would just use the BOP models when working with BOP data if possible.
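If you do need to use non-BOP models, the offset in question can be computed from the mesh vertices; a small sketch of what "compute the offset" could look like (the helper name is mine, not from bop_toolkit):

```python
import numpy as np

def bbox_center_offset(vertices):
    """Offset from the model origin to the center of the vertices'
    axis-aligned 3D bounding box.

    BOP model origins coincide with this center; for a mesh whose
    origin does not, subtracting the offset re-centers it before
    applying cam_R_m2c / cam_t_m2c.
    """
    mins = vertices.min(axis=0)
    maxs = vertices.max(axis=0)
    return (mins + maxs) / 2.0

# Usage sketch: verts_centered = verts - bbox_center_offset(verts)
```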
I am trying to project the object meshes into the pointcloud for the PBR LM and ITODD scenes. Particularly for some ITODD objects, the projections are off. I create the pointcloud by back-projecting the depth image with the camera's focal length. The meshes are loaded (and, for LM, scaled by 1/1000) and then transformed according to their pose with `cam_R_m2c` and `cam_t_m2c`. For example, a screenshot of the scene from image folder 25 and scene 247 can be seen below. For the ITODD scene, some meshes do not align with the pointcloud, but the procedure works reasonably well for LM. The script used to generate the images is shown below.
What can I do to project a mesh into a pointcloud given a known folder and scene?
Code to generate the projections