Closed: Sheradil closed this issue 1 year ago
Hi @Sheradil ,
The wrapping on the sharp edges suggests that the issue lies in the intrinsic calibration of the camera. Point cloud wrapping is not affected by the extrinsic calibration, which only involves rotation and translation; it is the intrinsic parameters that determine the lens distortion. I recommend double-checking that the camera_info topic contains the correct camera model and accurate values.
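For intuition, here is a minimal sketch (not code from this repo; the intrinsics `K`, the distortion vectors, and the test point are made-up values) of how the plumb_bob and fisheye models project the same normalized point. Near the image edge the two models disagree substantially even with all-zero distortion coefficients, which is why a wrong model string in camera_info shows up most strongly at the edges:

```python
import math

def plumb_bob_project(x, y, K, D):
    # Plumb_bob radial/tangential model on normalized coords (x, y);
    # D = (k1, k2, p1, p2, k3) as in sensor_msgs/CameraInfo.
    k1, k2, p1, p2, k3 = D
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    fx, fy, cx, cy = K
    return fx * xd + cx, fy * yd + cy

def fisheye_project(x, y, K, D):
    # Equidistant fisheye model: r -> theta = atan(r), then a
    # polynomial in theta; D = (k1, k2, k3, k4).
    k1, k2, k3, k4 = D
    r = math.hypot(x, y)
    theta = math.atan(r)
    theta_d = theta * (1 + k1 * theta**2 + k2 * theta**4
                       + k3 * theta**6 + k4 * theta**8)
    scale = theta_d / r if r > 1e-9 else 1.0
    fx, fy, cx, cy = K
    return fx * x * scale + cx, fy * y * scale + cy

K = (600.0, 600.0, 640.0, 360.0)  # hypothetical intrinsics
# Same off-center point, both models, zero distortion coefficients:
u_pb, _ = plumb_bob_project(0.8, 0.0, K, (0, 0, 0, 0, 0))
u_fe, _ = fisheye_project(0.8, 0.0, K, (0, 0, 0, 0))
# The fisheye model still applies atan() compression even with zero
# coefficients, so the pixel locations differ near the edge.
print(round(u_pb, 1), round(u_fe, 1))  # 1120.0 1044.8
```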
If you are currently using our master branch, it could be the source of the issue. Note that the calib-v2 branch, which is currently undergoing testing, incorporates the camera model into the final projection (e.g. `if (distortion_model == "fisheye")`). An easy fix is to replace assess_calibration.cpp with the one in the calib-v2 branch.
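As a rough illustration of that branch (a toy 1-D projection dispatching on the camera_info model string; the function name and simplifications are mine, not the actual calib-v2 code):

```python
import math

def project(x, fx, cx, distortion_model):
    """Toy 1-D projection that branches on the distortion_model string
    from camera_info, mirroring the idea behind the calib-v2 branch."""
    if distortion_model == "fisheye":
        # Equidistant mapping: r -> theta = atan(r)
        x = math.atan(x)
    elif distortion_model != "plumb_bob":
        raise ValueError(f"unsupported distortion model: {distortion_model}")
    return fx * x + cx

print(project(0.8, 600.0, 640.0, "plumb_bob"))  # 1120.0
print(project(0.8, 600.0, 640.0, "fisheye"))
```

The point is simply that the model string selects a different projection path, so a mismatched string silently applies the wrong geometry.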
Hi @chinitaberrio and thanks for your quick response.
I asked one of the project partners and he confirmed that the published camera model is indeed wrong: the lens we used is a fisheye lens, not a plumb_bob model. And if the model in the camera_info topic is incorrect, then the camera matrix and the distortion coefficients are most likely wrong as well.
For now I have replaced assess_calibration.cpp with the one from the calib-v2 branch. I'm waiting for the camera to arrive and will try again with calibrated values.
I do have 2 more questions:
Hi @Sheradil
1) You can find the changes to calib-v2 in this branch: https://github.com/acfr/cam_lidar_calibration/pull/36#issue-1740771731. 2) Not yet, but I have seen some colleagues change the board extraction method to get the features needed for the calibration.
Hello,
thanks for your really great work. We tried software from other teams and it was all pretty bad. I tried to implement my own, but ended up with bugs I couldn't fix. Then we found your software, and again, thanks for your work; it's really easy to use.
Currently I'm trying to calibrate an Ouster128 with an RGB camera (I modified your code so that it works with the 16-bit ring values). Our RGB camera uses a plumb_bob distortion model lens, and from the images in this repo it looks like you used a non-fisheye lens as well (off the top of my head, images from fisheye lenses look different). I then calibrated the camera with 28 samples with, in my opinion, high enough variance (different positions, different xyz angles, different distances; with the 128 layers of the Ouster, enough 'rings' hit the board even when it is placed further away).
Then I ran the "assess" script and got the following result:
My camera has a non-fisheye lens. The images I used were distorted; I thought undistorting was not necessary because your software receives the camera matrix anyway. After getting the result shown in the attached image, I rectified the images and calibrated again (second attempt, keeping the distortion vector as it was), but got almost the same result. I then looked at https://github.com/acfr/cam_lidar_calibration/issues/7 and followed your advice: "If you publish a rectified image, the distortion matrix in the camera_infos should also be all zeros. If the D matrix is all zeros then you can leave that line of code. Otherwise, simply feed in a zero matrix in place of the distcoeff." But if I enter all zeros in the D vector and calibrate again (third attempt), I get projections that are completely wrong (I didn't take an image of that).
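To make the zero-D advice concrete, here is a small sketch of the plumb_bob distortion step (an assumed simplified model with made-up coefficients, not the repository's code). On a rectified image the distortion has already been removed, so the correct D is all zeros and the mapping must be the identity; keeping the original non-zero D applies the distortion a second time and shifts every projected point:

```python
def distort_plumb_bob(x, y, D):
    # Apply plumb_bob distortion to normalized coords;
    # D = (k1, k2, p1, p2, k3).
    k1, k2, p1, p2, k3 = D
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

# With all-zero D the mapping is the identity (pure pinhole), which is
# what a rectified image needs:
assert distort_plumb_bob(0.5, 0.2, (0, 0, 0, 0, 0)) == (0.5, 0.2)

# Feeding a non-zero D (hypothetical values) on top of a rectified
# image distorts the already-undistorted point a second time:
xd, yd = distort_plumb_bob(0.5, 0.2, (-0.2, 0.05, 0, 0, 0))
print(xd, yd)
```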
Do you have any explanation for the circular projection that can be seen in the attached image (and how to get rid of it)?
Thanks in advance