microsoft / Azure_Kinect_ROS_Driver

A ROS sensor driver for the Azure Kinect Developer Kit.
MIT License

What is the connection between set resolution of the camera and the factory calibration? #213

Open Enes1097 opened 3 years ago

Enes1097 commented 3 years ago

Hey, I'm currently working on an extrinsic spatial calibration between a Velodyne LiDAR and the Azure Kinect Camera with this package (https://github.com/beltransen/velo2cam_calibration).

While running this package, I saw some odd output values that I believe are related solely to the settings of the Kinect camera: when I set the camera resolution to 720p and run the calibration, it gives me quite accurate extrinsic calibration parameters. If I then change the camera resolution to e.g. 1080p or 1536p, the extrinsic calibration parameters change too.

Below you can see a list of the outputs for all resolutions (the orientations are in quaternions):

- 720p: x=0.0420; y=-0.0131; z=-0.1695; roll=0.024; pitch=0.0001; yaw=0.0101
- 1080p: x=0.0426; y=-0.0137; z=-0.1699; roll=0.0024; pitch=0.0002; yaw=0.0101
- 1440p: x=0.0428; y=-0.0142; z=-0.1702; roll=0.0019; pitch=0.0000; yaw=0.0104
- 1536p: x=0.0252; y=-0.0108; z=-0.1727; roll=0.0011; pitch=-0.0006; yaw=0.0109
- 2160p: x=0.0443; y=-0.0141; z=-0.1679; roll=0.0021; pitch=0.0015; yaw=0.0104
- 3072p: x=0.0262; y=-0.0138; z=-0.1704; roll=0.0010; pitch=0.0003; yaw=0.0108

I believe, however, that these parameters should stay roughly the same for every resolution, because the sensors are rigidly fixed to a platform and were not moved during the entire sensor data acquisition.

--> For reference, the optimal values should be: x=0.0458; y=0; z=-0.1512; roll=0.0; pitch=0.0; yaw=0.0

The package can handle images distorted with the rational polynomial distortion model and also works with rolling-shutter cameras, so I don't see the problem lying there.

Is it possible that the intrinsic calibration parameters were acquired at one resolution and are therefore inaccurate for all other resolutions?

Thanks.

ooeygui commented 3 years ago

The ROS node transforms the factory calibration in https://github.com/microsoft/Azure_Kinect_ROS_Driver/blob/melodic/src/k4a_calibration_transform_data.cpp. @qm13 - can you shed some light on this question?

Enes1097 commented 3 years ago

Hello @ooeygui @qm13, I read through the C++ code and saw how the intrinsic matrix is built. Sadly, I still don't really understand where the parameters for the intrinsic matrix come from. Are they taken from the k4a_calibration_camera_t struct? Do they change depending on the set resolution, or are they constant parameters?

```cpp
void K4ACalibrationTransformData::printCameraCalibration(k4a_calibration_camera_t& calibration)
{
  printExtrinsics(calibration.extrinsics);

  ROS_INFO("\t\t Resolution:");
  ROS_INFO_STREAM("\t\t\t Width: " << calibration.resolution_width);
  ROS_INFO_STREAM("\t\t\t Height: " << calibration.resolution_height);

  ROS_INFO("\t\t Intrinsics:");
  ROS_INFO_STREAM("\t\t\t Model Type: " << calibration.intrinsics.type);
  ROS_INFO_STREAM("\t\t\t Parameter Count: " << calibration.intrinsics.parameter_count);
  ROS_INFO_STREAM("\t\t\t cx: " << calibration.intrinsics.parameters.param.cx);
  ROS_INFO_STREAM("\t\t\t cy: " << calibration.intrinsics.parameters.param.cy);
  ROS_INFO_STREAM("\t\t\t fx: " << calibration.intrinsics.parameters.param.fx);
  ROS_INFO_STREAM("\t\t\t fy: " << calibration.intrinsics.parameters.param.fy);
  ROS_INFO_STREAM("\t\t\t k1: " << calibration.intrinsics.parameters.param.k1);
  ROS_INFO_STREAM("\t\t\t k2: " << calibration.intrinsics.parameters.param.k2);
  ROS_INFO_STREAM("\t\t\t k3: " << calibration.intrinsics.parameters.param.k3);
  ROS_INFO_STREAM("\t\t\t k4: " << calibration.intrinsics.parameters.param.k4);
  ROS_INFO_STREAM("\t\t\t k5: " << calibration.intrinsics.parameters.param.k5);
  ROS_INFO_STREAM("\t\t\t k6: " << calibration.intrinsics.parameters.param.k6);
  ROS_INFO_STREAM("\t\t\t codx: " << calibration.intrinsics.parameters.param.codx);
  ROS_INFO_STREAM("\t\t\t cody: " << calibration.intrinsics.parameters.param.cody);
  ROS_INFO_STREAM("\t\t\t p2: " << calibration.intrinsics.parameters.param.p2);
  ROS_INFO_STREAM("\t\t\t p1: " << calibration.intrinsics.parameters.param.p1);
  ROS_INFO_STREAM("\t\t\t metric_radius: " << calibration.intrinsics.parameters.param.metric_radius);
}
```

christian-rauch commented 3 years ago

> For reference, the optimal values should be: x=0.0458; y=0; z=-0.1512; roll=0.0; pitch=0.0; yaw=0.0

Where did you take these values from? From the CAD, I can see that there is some rotation between the depth and colour sensor. So it should not be rpy=(0,0,0).

The intrinsics are resolution-dependent and stored inside the device during the factory calibration. These intrinsics are read from the device when the camera is started (K4AROSDevice::startCameras):

https://github.com/microsoft/Azure_Kinect_ROS_Driver/blob/c0742b9e470c9e688d796029f10cb52e1a763a4a/src/k4a_calibration_transform_data.cpp#L24-L29

and this is then used to set the ROS sensor_msgs/CameraInfo:

https://github.com/microsoft/Azure_Kinect_ROS_Driver/blob/c0742b9e470c9e688d796029f10cb52e1a763a4a/src/k4a_calibration_transform_data.cpp#L308-L358

Enes1097 commented 3 years ago

> For reference, the optimal values should be: x=0.0458; y=0; z=-0.1512; roll=0.0; pitch=0.0; yaw=0.0
>
> Where did you take these values from? From the CAD, I can see that there is some rotation between the depth and colour sensor. So it should not be rpy=(0,0,0).

I was trying to do an extrinsic calibration between a Velodyne LiDAR sensor and the Azure Kinect DK camera, not between the color and depth cameras of the Azure Kinect itself; sorry for the misleading description. Between the different tests I carried out, I used a stationary sensor setup with a fixed target pose and only changed the camera resolution (the LiDAR settings were fixed), which is why I was surprised to see that the calibration output (or rather the translation in the x-direction) of the ROS node differs so much between some of the tests. In the sensor rack where both sensors are rigidly mounted, there is little to no rotation between them. Also, I was mistaken: the rotation values correspond to roll, pitch, and yaw, not quaternions, obviously :)

> The intrinsics are resolution-dependent and stored inside the device during the factory calibration. These intrinsics are read from the device when the camera is started (K4AROSDevice::startCameras):
>
> https://github.com/microsoft/Azure_Kinect_ROS_Driver/blob/c0742b9e470c9e688d796029f10cb52e1a763a4a/src/k4a_calibration_transform_data.cpp#L24-L29
>
> and this is then used to set the ROS sensor_msgs/CameraInfo: https://github.com/microsoft/Azure_Kinect_ROS_Driver/blob/c0742b9e470c9e688d796029f10cb52e1a763a4a/src/k4a_calibration_transform_data.cpp#L308-L358

Thank you for the explanation, this clears it up for me 👍