microsoft / Azure_Kinect_ROS_Driver

A ROS sensor driver for the Azure Kinect Developer Kit.
MIT License

Issues with Undistorting RGB and Depth data (for undistorted point clouds) #247

Open Stan850 opened 2 years ago

Stan850 commented 2 years ago

Hi, I have tried to undistort the RGB and depth data from the Azure Kinect ROS driver, but the images I obtain still look warped (see the examples below). I am focusing on the RGB case here because I applied the same processing steps to the depth data.

My first approach was to use kinect_rgbd.launch to obtain rectified images; this launch file is supposed to run the image_proc and depth_image_proc nodes to undistort and rectify the images. However, as marked on the image below, there is still some warping around the corners of the image:

[image: kinect_rgbd]
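For reference, my understanding is that the rectification applied by image_proc boils down to something like the sketch below, using the values published in the colour camera_info message. The function name is mine, and I am assuming the driver fills D with the full 8-coefficient rational distortion model:

```cpp
#include <opencv2/opencv.hpp>

// Sketch of the undistort/rectify step I believe image_proc performs,
// driven entirely by the camera_info fields (K, D, R, P).
cv::Mat rectifyLikeImageProc(const cv::Mat& raw,
                             const cv::Mat& K,  // 3x3, camera_info.K
                             const cv::Mat& D,  // 1xN, camera_info.D
                             const cv::Mat& R,  // 3x3, camera_info.R
                             const cv::Mat& P)  // 3x4, camera_info.P
{
  cv::Mat map1, map2, rectified;
  // Precompute the undistortion/rectification maps (image_proc caches these),
  // then remap the raw image into the rectified frame.
  cv::initUndistortRectifyMap(K, D, R, P.colRange(0, 3), raw.size(),
                              CV_16SC2, map1, map2);
  cv::remap(raw, rectified, map1, map2, cv::INTER_LINEAR);
  return rectified;
}
```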

My second approach was to modify the ROS driver to undistort both the RGB and depth images before publishing to the topics /rgb/image_raw and /depth/points2. In this case, I feed the existing rgb_buffer_mat data into cv::undistort along with the factory camera intrinsics and distortion coefficients. I perform the same operation on the depth_frame_buffer_mat data before it is passed to depth_image_to_point_cloud to convert the depth data to a point cloud.
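A condensed sketch of that modification is below (the helper name undistortColor and the exact wiring into the driver are paraphrased; rgb_buffer_mat and cv::undistort are what I actually use):

```cpp
#include <k4a/k4a.h>
#include <opencv2/opencv.hpp>

// Undistort the colour image using the factory calibration from the device.
cv::Mat undistortColor(const cv::Mat& rgb_buffer_mat,
                       const k4a_calibration_t& calibration)
{
  const auto& p = calibration.color_camera_calibration.intrinsics.parameters.param;

  // Factory intrinsics as a 3x3 camera matrix.
  cv::Mat K = (cv::Mat_<double>(3, 3) << p.fx, 0.0,  p.cx,
                                         0.0,  p.fy, p.cy,
                                         0.0,  0.0,  1.0);

  // Brown-Conrady coefficients in OpenCV's rational-model order.
  cv::Mat D = (cv::Mat_<double>(1, 8) << p.k1, p.k2, p.p1, p.p2,
                                         p.k3, p.k4, p.k5, p.k6);

  cv::Mat undistorted;
  cv::undistort(rgb_buffer_mat, undistorted, K, D);
  return undistorted;
}
```

I do the equivalent for depth_frame_buffer_mat (with the depth camera's calibration) before calling depth_image_to_point_cloud.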

When I extract an RGB image, I get a similar output to what I observed with the first approach:

[image: manual_undistorted]

For comparison, here is an example of what the raw output looks like:

[image: image_raw]

Is there a reason why this warping occurs, and is there a way to eliminate it? I need undistorted data because my object detection algorithm only supports undistorted input.

Desktop Environment Details:

Thanks in advance!