EPVelasco / lidar-camera-fusion

The code, implemented in ROS, projects a point cloud obtained by a Velodyne VLP16 3D-LiDAR sensor onto an image from an RGB camera.
GNU General Public License v3.0

Why is camera_matrix a 3×4 matrix and not 3×3? #16

Open YHuniv opened 10 months ago

YHuniv commented 10 months ago

Hi,

Using the camera_calibration package, we get a 3×3 camera matrix:

430.215550 0.000000 306.691343
0.000000 430.531693 227.224800
0.000000 0.000000 1.000000

https://wiki.ros.org/camera_calibration/Tutorials/MonocularCalibration

But in your cfg_params.yaml you are using a 3×4 matrix:

camera_matrix: [915.828857, 0.000000, 644.015094, 0.000000,
                0.000000, 925.069885, 372.912666, 0.000000,
                0.000000, 0.000000, 1.000000, 0.000000]

How did you get that? Thanks

EPVelasco commented 10 months ago

Hello, just add a column of zeros as the final column of the matrix:

camera_matrix: [430.215550, 0.000000, 306.691343, 0.000000,
                0.000000, 430.531693, 227.224800, 0.000000,
                0.000000, 0.000000, 1.000000, 0.000000]
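For reference, the conversion above can be sketched in a few lines of Python (values taken from the comments in this thread; appending the zero column simply turns the intrinsic matrix K into the 3×4 projection matrix P = [K | 0]):

```python
# 3x3 intrinsic matrix K from camera_calibration (values from this thread).
K = [
    [430.215550, 0.000000, 306.691343],
    [0.000000, 430.531693, 227.224800],
    [0.000000, 0.000000, 1.000000],
]

# Append a zero column to each row: P = [K | 0], a 3x4 projection matrix.
P = [row + [0.0] for row in K]

# Flatten row-major to match the YAML list layout in cfg_params.yaml.
camera_matrix = [v for row in P for v in row]
```

The zero column corresponds to a zero translation inside the projection matrix, so multiplying P by a homogeneous camera-frame point gives the same pixel as K alone.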

YHuniv commented 10 months ago

Thanks, you did that because the RTlc matrix in the .cpp is 4×4!

How did you do the camera and LiDAR calibration? (I tried the cam_lidar_calibration package.) My result is in this format: (roll, pitch, yaw, x, y, z). How do I convert it to the rlc and tlc format? I think tlc is directly x, y, and z.
Thanks again

EPVelasco commented 10 months ago

Yes, tlc is directly x, y, z. And for roll, pitch, and yaw you can use the Eigen library; check this example.
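A minimal sketch of that conversion in plain Python, standing in for the Eigen AngleAxisd composition (the ZYX order below, R = Rz(yaw)·Ry(pitch)·Rx(roll), is the usual ROS/Eigen convention, but double-check it against your calibration tool):

```python
import math

def rpy_to_matrix(roll, pitch, yaw):
    """Build a rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll),
    the same composition as Eigen's AngleAxisd chain in the linked example."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

# Calibration result from cam_lidar_calibration (roll, pitch, yaw, x, y, z):
rlc = rpy_to_matrix(-1.5806, -0.0090, -1.4611)  # 3x3 rotation for cfg_params.yaml
tlc = [0.1701, 0.0370, -0.4268]                 # translation is taken directly
```

Flattening `rlc` row-major gives the nine numbers expected by the `rlc` entry of cfg_params.yaml.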

YHuniv commented 10 months ago

Hi again, I am still facing a problem with the fusion. As I told you, I used this package for calibration: https://github.com/acfr/cam_lidar_calibration/tree/master. The result seems to be good (see pictures).

fusion_camlid

Calibration params (roll, pitch, yaw, x, y, z): -1.5806, -0.0090, -1.4611, 0.1701, 0.0370, -0.4268

Mean reprojection error across 53 samples

NoFusion

I don't understand where the error is, any idea? Thanks in advance

EPVelasco commented 9 months ago

Hi, could you send a picture of how your camera and LiDAR are located?

I also suggest you check this issue. It shows how the homogeneous transformation matrix is used in the application, since the tool I used to calibrate the sensors assumes a different reference axis for the LiDAR than the method you used for the LiDAR-camera calibration.

YHuniv commented 9 months ago

Hi,

Here is a picture of my configuration (Velodyne VLP-16 and monocular camera). tf_lid_cam

In RViz, I used my calibration parameters for the transformation: rosrun tf static_transform_publisher 0.1701 0.0370 -0.4268 -1.5806 -0.0090 -1.4611 velodyne camera_frame 100. It seems correct; I saw the same configuration in your answer in the issue (see picture). tfLid_cam

thanks

EPVelasco commented 9 months ago

I think the main problem is the representation of your axes in the camera frame. I suggest you check this issue; it shows how the LiDAR-camera axes are represented. To find out whether this is the problem, do a quick test with the rotation matrix set to RPY (0, 0, 0) and only the translation tlc (0.1701, 0.0370, -0.4268), that is, in the file cfg_params.yaml:

rlc: [1.0, 0.0, 0.0,
      0.0, 1.0, 0.0,
      0.0, 0.0, 1.0]

tlc: [0.1701, 0.0370, -0.4268]

On the other hand, a problem I also foresee is that your RGB image is not rectified. The image is distorted due to the optical characteristics of the lens. I suggest you check how to rectify the image; here is an example.
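For intuition about why rectification matters: camera_calibration fits the plumb-bob distortion model, and the sketch below shows how it bends the ideal pinhole coordinates (coefficient names k1, k2, p1, p2, k3 follow the ROS camera_info convention; the numeric values in the usage are placeholders, not your camera's):

```python
def distort(x, y, k1, k2, p1, p2, k3):
    """Apply the plumb-bob model to normalized pinhole coordinates (x, y):
    radial terms (k1, k2, k3) plus tangential terms (p1, p2)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 ** 3
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd

# Placeholder coefficients: a point near the image edge shifts noticeably,
# which is why projecting the cloud onto the raw image misaligns there.
xd, yd = distort(0.5, 0.3, -0.2, 0.05, 0.001, 0.001, 0.0)
```

Rectifying the image (e.g. with image_proc) undoes this, so the pinhole projection used by the fusion matches the pixels.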