microsoft / Azure-Kinect-Sensor-SDK

A cross platform (Linux and Windows) user mode SDK to read data from your Azure Kinect device.

Potential depth misalignment between depth camera and rgb camera #1946

Open Basilel7 opened 10 months ago

Basilel7 commented 10 months ago

I'm working on a project where I need to localize an object using the RGB and depth streams. I use the ArUco library to localize the object in the RGB image, and I transform the depth image into a point cloud in the RGB camera's 3D space.
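For reference, this is roughly the pipeline I follow, written here as a minimal sketch against the SDK's C transformation API (the function and variable names are only for illustration, not my actual code): align the depth image to the color camera, then turn the aligned depth into an XYZ point cloud expressed in the RGB camera frame.

```c
#include <k4a/k4a.h>
#include <stdint.h>
#include <stdio.h>

// Sketch: align the depth image to the color camera, then turn it into a
// point cloud expressed in the color (RGB) camera's 3D coordinate system.
// `device`, `depth_image`, and the chosen modes are assumed to exist already.
static k4a_image_t depth_to_color_point_cloud(k4a_device_t device,
                                              k4a_image_t depth_image,
                                              k4a_depth_mode_t depth_mode,
                                              k4a_color_resolution_t color_res)
{
    k4a_calibration_t calibration;
    if (k4a_device_get_calibration(device, depth_mode, color_res, &calibration) != K4A_RESULT_SUCCEEDED)
        return NULL;

    k4a_transformation_t transformation = k4a_transformation_create(&calibration);

    int w = calibration.color_camera_calibration.resolution_width;
    int h = calibration.color_camera_calibration.resolution_height;

    // Depth re-projected into the color camera geometry (same resolution as the RGB image).
    k4a_image_t transformed_depth = NULL;
    k4a_image_create(K4A_IMAGE_FORMAT_DEPTH16, w, h, w * (int)sizeof(uint16_t), &transformed_depth);

    // XYZ point cloud: 3 x int16_t per pixel, in millimeters, color camera frame.
    k4a_image_t point_cloud = NULL;
    k4a_image_create(K4A_IMAGE_FORMAT_CUSTOM, w, h, w * 3 * (int)sizeof(int16_t), &point_cloud);

    if (k4a_transformation_depth_image_to_color_camera(transformation, depth_image, transformed_depth) != K4A_RESULT_SUCCEEDED ||
        k4a_transformation_depth_image_to_point_cloud(transformation, transformed_depth,
                                                      K4A_CALIBRATION_TYPE_COLOR, point_cloud) != K4A_RESULT_SUCCEEDED)
    {
        printf("transformation failed\n");
    }

    k4a_image_release(transformed_depth);
    k4a_transformation_destroy(transformation);
    return point_cloud; // caller releases with k4a_image_release()
}
```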

The object should end up at the same location in the RGB camera's 3D space, but the depth image of the object and the ArUco pose estimation of the object are about 1 cm to 2 cm apart, mostly along the camera axis. The ArUco pose estimation is unlikely to be the cause, since previous experiments showed it to be more precise than that. So the error should come either from the camera's depth estimation or from the alignment of the depth to the RGB camera space (depth/RGB intrinsics, or depth-camera-to-RGB-camera extrinsics).
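To look at the extrinsics side, one quick sanity check is to dump what the factory calibration reports for the depth-to-color transform and confirm it matches the calibration actually used for the transformation. A minimal sketch (the helper name is mine):

```c
#include <k4a/k4a.h>
#include <stdio.h>

// Sketch: print the factory extrinsics the SDK uses when mapping the depth
// camera into the color (RGB) camera frame. Translation is in millimeters,
// rotation is a row-major 3x3 matrix. `calibration` is assumed to come from
// k4a_device_get_calibration() with the same modes used for capture.
static void print_depth_to_color_extrinsics(const k4a_calibration_t *calibration)
{
    const k4a_calibration_extrinsics_t *ex =
        &calibration->extrinsics[K4A_CALIBRATION_TYPE_DEPTH][K4A_CALIBRATION_TYPE_COLOR];

    printf("translation (mm): %.3f %.3f %.3f\n",
           ex->translation[0], ex->translation[1], ex->translation[2]);
    printf("rotation:\n");
    for (int r = 0; r < 3; r++)
        printf("  %.6f %.6f %.6f\n",
               ex->rotation[3 * r + 0], ex->rotation[3 * r + 1], ex->rotation[3 * r + 2]);
}
```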

[Attached screenshots: bk2, bk]

In red, the mesh projected with the ArUco pose; in black, the point cloud from the depth camera. Selecting the ArUco marker center directly on the point cloud and comparing it with the ArUco pose estimation confirms the depth offset.
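This is roughly how I read the marker-center point out of the point cloud for that comparison (sketch only; the helper name is a placeholder, and (u, v) is the marker center pixel coming from the ArUco detection on the RGB image):

```c
#include <k4a/k4a.h>
#include <stdint.h>
#include <stdio.h>

// Sketch: look up the 3D point the depth pipeline assigns to the ArUco marker
// center, to compare it against the ArUco pose. `point_cloud` is the XYZ image
// from the earlier sketch (color camera frame, int16_t XYZ in millimeters);
// (u, v) is the marker center pixel in the RGB image.
static void print_point_at_pixel(k4a_image_t point_cloud, int u, int v)
{
    int width = k4a_image_get_width_pixels(point_cloud);
    const int16_t *xyz = (const int16_t *)(const void *)k4a_image_get_buffer(point_cloud);
    size_t idx = ((size_t)v * (size_t)width + (size_t)u) * 3;

    // Z == 0 means the depth camera produced no valid measurement for this pixel.
    printf("depth point at (%d, %d): X=%d mm, Y=%d mm, Z=%d mm\n",
           u, v, xyz[idx + 0], xyz[idx + 1], xyz[idx + 2]);
}
```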

I'm working on Linux (Ubuntu 20.04).