**francescomilano172** opened 2 years ago
Thank you for the bug report and the details.

If `k4aviewer` already shows this effect on the point cloud, then there is nothing the ROS node can do for the point cloud on `/points2`.

You can use the original colour and depth images and the intrinsics + extrinsics in two ways:

1. use `kinect_rgbd.launch` to do the rectification, registration and projection manually
2. recalibrate the cameras and check whether `kinect_rgbd.launch` provides better results afterwards

It may be sufficient to only adjust the extrinsic calibration between the colour and depth camera and keep the intrinsics as they are.
Hi @christian-rauch, thank you for your answer. For number 1., yes, this is what we are already doing, following the procedure in https://github.com/microsoft/Azure_Kinect_ROS_Driver/issues/212. For 2., can you recommend any calibration procedure for the extrinsics between the colour and the IR camera?
Since https://github.com/microsoft/Azure_Kinect_ROS_Driver/pull/200, you can use the `camera_calibration` package to assign new intrinsic parameters. You can calibrate the RGB and IR cameras separately.

The extrinsic parameters are published as tf. I don't know of a procedure that does that automatically, though. You may have to figure it out manually by adjusting the extrinsic tf and comparing the quality of the registration.
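To illustrate the manual adjustment suggested above, here is a minimal sketch of how a small change to the depth-to-colour extrinsic shifts where a depth pixel samples colour. All intrinsic and extrinsic values below are made-up placeholders, not actual Azure Kinect calibration, and the function name is hypothetical:

```python
# Hypothetical sketch: back-project a depth pixel and reproject it into the
# colour camera with an adjustable extrinsic translation. All numbers below
# are placeholders, not real Azure Kinect calibration values.
import numpy as np

K_depth = np.array([[504.0, 0.0, 320.0],
                    [0.0, 504.0, 288.0],
                    [0.0, 0.0, 1.0]])
K_color = np.array([[600.0, 0.0, 640.0],
                    [0.0, 600.0, 360.0],
                    [0.0, 0.0, 1.0]])
R = np.eye(3)                      # depth -> colour rotation (identity here)
t = np.array([-0.032, 0.0, 0.0])   # depth -> colour translation in metres

def depth_pixel_to_color_pixel(u, v, z, t_offset=np.zeros(3)):
    """Back-project (u, v) at depth z [m] from the depth camera and
    reproject into the colour image, optionally perturbing the extrinsic."""
    p_depth = z * np.linalg.inv(K_depth) @ np.array([u, v, 1.0])
    p_color = R @ p_depth + t + t_offset
    uvw = K_color @ p_color
    return uvw[:2] / uvw[2]

base = depth_pixel_to_color_pixel(320, 288, 1.0)
shifted = depth_pixel_to_color_pixel(320, 288, 1.0,
                                     t_offset=np.array([0.002, 0.0, 0.0]))
# Even a 2 mm extrinsic offset visibly shifts the sampled colour pixel,
# which is why tweaking the extrinsic tf changes the registration quality.
```

Comparing `base` and `shifted` while varying `t_offset` is essentially the manual tuning loop described above, just done numerically instead of through tf.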
**Describe the bug** Using the factory calibration, point clouds exhibit "color spill" or "flying pixels" at object boundaries. This happens both with the latest version of the ROS driver and with the Azure Kinect Viewer. Can this problem be solved through manual calibration?
**To Reproduce**

*Kinect Azure Viewer:*
1. Launch `k4aviewer` and open the device.
2. Under `View Mode`, select `3D` and `Color`.

*`/points2` topic from the ROS driver:*
1. Set `point_cloud` and `rgb_point_cloud` to `true` in `kinect_rgbd.launch`. Also keep `point_cloud_in_depth_frame` set to `false`, so as to have `depth_to_rgb`-like backprojection. This should be the choice that gives the least color spill, as mentioned for instance here.
2. Launch `kinect_rgbd.launch` and look at the `/points2` topic in RViz.

**Expected behavior** In the desired setup, I would need to have aligned and rectified RGB and depth frames and use the camera intrinsics to retrieve a point cloud. In the point cloud, there should be no color spill.
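For reference, the backprojection step described above can be sketched in a few lines of NumPy. This is a generic pinhole back-projection, not the driver's actual implementation; `fx`, `fy`, `cx`, `cy` are placeholder values:

```python
# Hypothetical sketch: back-project aligned, rectified RGB-D frames into a
# coloured point cloud with pinhole intrinsics. The intrinsics below are
# placeholders, not real Azure Kinect calibration.
import numpy as np

fx, fy, cx, cy = 600.0, 600.0, 640.0, 360.0

def backproject(depth_m, rgb):
    """depth_m: (H, W) depth in metres; rgb: (H, W, 3) aligned colour image.
    Returns an (N, 6) XYZRGB array for pixels with valid (non-zero) depth."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    valid = depth_m > 0
    xyz = np.stack([x[valid], y[valid], depth_m[valid]], axis=-1)
    return np.hstack([xyz, rgb[valid].astype(np.float64)])

# Tiny example: a 2x2 frame with a single valid depth pixel.
depth = np.zeros((2, 2))
depth[0, 0] = 1.0
rgb = np.full((2, 2, 3), 255, dtype=np.uint8)
cloud = backproject(depth, rgb)  # shape (1, 6)
```

Note that this step itself introduces no spill: any mis-coloured points must already be present in the aligned RGB-D input, which is why the registration and interpolation stages matter.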
**Screenshots** In all the following screenshots, color spill can be noticed on the edges of the pink ball.

- `/points2` topic (`point_cloud_in_depth_frame` set to `true`):
- `/points2` topic (`point_cloud_in_depth_frame` set to `false`):

**Desktop:**
Additional context It is unclear to me whether color spill is a problem that can be avoided. Previous conversations seem to hint at the fact that this is an inherent limitation of the Azure Kinect (e.g., the paper mentioned here). On the other hand, some threads seem to suggest that the problem might be related to imperfect calibration/camera alignment (e.g., here and here) and that a custom calibration can yield better alignment between the RGB and IR cameras (e.g., here and here). However, the conversations are overall inconclusive, with mixed opinions (negative 1, negative 2, positive, unclear).
I also saw that there is now the possibility to manually calibrate the intrinsics of the cameras and use them through the ROS interface instead of the factory calibration (here and here). Can a custom calibration alleviate or fix this problem, or is it a hardware limitation?
Also, is this to some extent due to the interpolation that is introduced both when warping the depth image into the RGB frame (e.g., here and here) and when rectifying the images before backprojection (as happens in `rgbd_launch` / `image_geometry` / `cv2.remap`, see e.g., this issue)? In particular, different interpolation schemes produce very different results (see, e.g., here), but even the recommended nearest-neighbor interpolation, which is used in `rgbd_launch` for rectification (here), does not solve the color spilling problem.
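The effect of the interpolation scheme on a depth edge can be shown with a minimal 1-D sketch, independent of any driver code: linearly resampling across a depth discontinuity invents intermediate depths that belong to neither surface (the "flying pixels"), while nearest-neighbour resampling does not. The depth values here are made up for illustration:

```python
# Hypothetical sketch: why interpolating depth across an object boundary
# creates flying pixels. Pure NumPy; not taken from the ROS driver.
import numpy as np

# A 1-D depth row: foreground object at 0.5 m next to background at 2.0 m.
depth = np.array([0.5, 0.5, 0.5, 2.0, 2.0, 2.0])

# Resample at a half-pixel offset, as rectification/warping typically does.
xs = np.arange(len(depth) - 1) + 0.5

linear = np.interp(xs, np.arange(len(depth)), depth)
nearest = depth[np.round(xs).astype(int)]

# linear contains 1.25 m at the edge: a point floating in mid-air between
# the two surfaces. nearest only ever returns 0.5 m or 2.0 m.
```

This is consistent with the observation above that nearest-neighbor rectification helps but does not fully solve the problem: it avoids invented depths, yet a depth pixel straddling the boundary can still be assigned the wrong colour by the registration itself.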