hku-mars / r3live

A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package
GNU General Public License v2.0

Large drift issue! #45

Closed · aditdoshi333 closed this 2 years ago

aditdoshi333 commented 2 years ago

Hello @ziv-lin,

I have an issue while mapping: sometimes I get large drift using r3live, but if I use vanilla FAST-LIO2 there is no visible drift. I was wondering how that is possible, since r3live also uses FAST-LIO for its LIO subsystem, right? Do I have to make any config changes? In some cases r3live performs exceptionally well, and I don't think there is any compute limitation, as I am running on a 64-core machine with 512 GB of RAM.

Thank you

ziv-lin commented 2 years ago

Have you configured the hardware correctly (e.g., is the calibration result bad)? Furthermore, is there anything wrong with the input camera data (e.g., underexposure or overexposure, or dramatic jumps in the image timestamps)?

aditdoshi333 commented 2 years ago

I think it is an exposure issue. What do you suggest: auto exposure or a manually set exposure?

ziv-lin commented 2 years ago

In my case, I prefer to set the exposure mode to auto-exposure.

aditdoshi333 commented 2 years ago

Yeah, thanks @ziv-lin, I think I need to try a better camera.

I wanted to try the same FLIR camera as yours, but it is not in stock, so I ordered this one: https://www.edmundoptics.com/p/basler-ace-aca1300-200uc-color-usb-30-camera/3419/ . Do you have any comments on camera selection? Anything to keep in mind for the best r3live performance?

Thanks

ziv-lin commented 2 years ago

No, I have no suggestions on the selection of the camera.

aditdoshi333 commented 2 years ago

Okay thanks

seajayshore commented 2 years ago

@aditdoshi333 In case it helps, I use a Basler a2a1920-160uc camera, which is similar to the one you use, and I get very good results so far.

Both my camera and yours have a USB interface, so they don't have the PTP timestamping capability that Ethernet cameras have. You have to use the trigger and the internal timestamp counters carefully to obtain good timestamping and synchronisation with these USB cameras. (You can't trust the timestamp ROS assigns when it receives the image over USB; you have to use the camera's internal timestamp.)
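In case anyone needs it, the usual pattern is to anchor the camera's internal clock to the host clock once, then stamp every frame from the device timestamp rather than the USB arrival time. A minimal Python sketch of that idea; the names are illustrative, not an actual driver API, and a real implementation should also track slow drift between the two clocks:

class DeviceClockMapper:
    def __init__(self):
        self.offset = None  # host_time - device_time, anchored once

    def stamp(self, device_time_s, host_arrival_s):
        # Anchor the device clock to the host clock on the first frame;
        # afterwards trust the device clock, not the USB arrival time,
        # which suffers from variable transfer/driver latency.
        if self.offset is None:
            self.offset = host_arrival_s - device_time_s
        return device_time_s + self.offset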

I know that R3LIVE/R2LIVE tries to estimate the time sync online, but I wonder if your drift is caused by poor time synchronisation, or else maybe poor calibration. Most of the issues I hit trying to make R3LIVE and similar packages work properly come from these two factors...

aditdoshi333 commented 2 years ago

Hello @seajayshore,

I got my new Basler camera and I am trying to calibrate it with the lidar using https://github.com/hku-mars/livox_camera_calib , but I am struggling a lot with it. Can you please shed some light on how you calibrated your camera with your lidar?

seajayshore commented 2 years ago

Hey @aditdoshi333, for extrinsic (spatial) calibration between lidar and camera I used the same tool you linked above. I think you've had feedback in other issues now, so maybe you've solved it already, but in case you're still struggling, here's my current status:

Before anything else, I tried to make sure I had a good intrinsic calibration of the camera and lens. I have only used the basic ROS checkerboard tool so far, but I recently got Kalibr working, so I will hopefully have something even better soon.

Then, what worked best for me with the livox_camera_calib (lidar extrinsics) tool was to record 4 different static scenes with good light and fairly clear geometric structure (e.g., the corner of a simple building).

I don't get very good results with this tool using single scenes, but using a set of 3-4 varied scenes with the "multi_calib.launch" process produced pretty impressive results.

As noted in another issue you asked (this one), to use this extrinsic in R3LIVE you have to invert/transpose the rotation (3x3) part of the matrix.

I'm honestly still confused about how to transform the translation part of the extrinsic correctly, but for my data in R3LIVE it looks great to just leave the translation extrinsic as [0,0,0] and enable online extrinsic estimation in the config file.
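For reference, inverting a rigid-body extrinsic is not just transposing the rotation block: if T has rotation R and translation t, the inverse has rotation R^T and translation -R^T t. A minimal numpy sketch, assuming your calibration tool outputs a 4x4 homogeneous matrix:

import numpy as np

def invert_extrinsic(T):
    # Invert a 4x4 rigid-body transform [R | t; 0 0 0 1].
    R = T[:3, :3]
    t = T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T        # the inverse of a rotation is its transpose
    T_inv[:3, 3] = -R.T @ t    # the translation must be rotated too
    return T_inv

The 3x3 block of the result is what goes into camera_ext_R; the last column is the matching camera_ext_t (which, as noted above, you can also leave at [0,0,0] and let R3LIVE estimate online).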

If you are still having issues and you're sure your camera intrinsics and lidar/camera extrinsics are all correct, then maybe you have issues with one of the other factors discussed above, such as time synchronisation.

Camilochiang commented 2 years ago

I tried Kalibr a couple of months ago @seajayshore, but the distortion model that r2live and r3live use is not available in Kalibr, so I'm not sure it would work better; at least it didn't in my case. Also, Kalibr calibrates between camera and IMU, while r2live and r3live use a calibration between camera and LiDAR (the correction to the IMU is done inside the code, I think).

Let us know how it goes in your case!

seajayshore commented 2 years ago

@Camilochiang Thanks for the comments, and for your other issues/questions on R3LIVE and related packages! They have helped me!

For the distortion model: it seems this is a problem of people using different names for the same thing!

ROS's "plumb_bob", Kalibr's "pinhole-radtan", and OpenCV's radial-tangential model are all just different names for the same model (some sources here, here, here, plus more if you google these keywords)... So if you use the "pinhole-radtan" model in Kalibr, it gives the k1, k2, p1, p2 parameters needed by ROS/R3LIVE/VINS, etc. (But no k3 parameter... Reading the OpenVINS GitHub, it seems this doesn't matter for them, so who knows...)
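To make that concrete: the coefficient layout is the same across these tools and only the vector length differs, so you can pad the missing k3 with zero. A minimal OpenCV sketch (the numbers are illustrative, not a real calibration):

import numpy as np
import cv2

# Kalibr's pinhole-radtan gives [k1, k2, p1, p2]; OpenCV/ROS plumb_bob
# expects [k1, k2, p1, p2, k3], so pad k3 with 0.
k1, k2, p1, p2 = -0.0184, 1.4038, -0.0046, -0.0053
dist = np.array([k1, k2, p1, p2, 0.0])

K = np.array([[1344.7, 0.0, 939.2],   # fx, 0, cx
              [0.0, 1336.1, 536.2],   # 0, fy, cy
              [0.0, 0.0, 1.0]])

img = cv2.imread("frame.png")
undistorted = cv2.undistort(img, K, dist)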

Anyway, I computed the new intrinsics in Kalibr yesterday and will use them to update all my dependent calibrations ASAP. I will let you know whether the results are better or worse.

As for camera/IMU calibration: yes, I have seen that the R2/R3LIVE extrinsics are a bit ambiguous... I read that they "treat the LiDAR frame and IMU (the LiDAR built-in IMU) as the same" (comment here), so that suggests really using the lidar-camera extrinsic as stated in the VIO config file.

I am doing camera/IMU calibration because I have an external IMU and want to try to correct for the lidar-to-IMU extrinsics in the lidar front-end. Currently I can't find a good lidar-to-IMU calibration tool (until this is released), so I am going to try to use my camera-to-lidar and camera-to-IMU extrinsics to compute one (as suggested here).
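In case it helps anyone trying the same chain, composing the two calibrations is plain matrix multiplication of the homogeneous transforms. A minimal sketch; the frame convention T_a_b = "maps points from frame b to frame a" is my assumption, so check it against what your calibration tools actually output:

import numpy as np

# Placeholders: T_imu_cam from a camera-IMU tool (e.g. Kalibr),
# T_cam_lidar from a camera-LiDAR tool (e.g. livox_camera_calib).
T_imu_cam = np.eye(4)
T_cam_lidar = np.eye(4)

# Chain lidar -> camera -> IMU:
T_imu_lidar = T_imu_cam @ T_cam_lidar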

I'll update when I manage to complete.

Camilochiang commented 2 years ago

Hey @seajayshore, thanks for your post. I will recheck the models, as now I'm curious and I don't remember the details. Thanks!

I have a similar situation, using an external BMI088 with a Livox Mid-70 (we are interested in being able to use the LiDAR at really short distances, so no other LiDAR is really useful).

Some comments about an external IMU that may help you

aditdoshi333 commented 2 years ago

Hello @Camilochiang @seajayshore,

Thanks a lot for the input. I have one more doubt: what kind of image input do we need to give r3live, rectified or distorted? My issue is that I am able to calibrate the extrinsics pretty well using https://github.com/hku-mars/livox_camera_calib (every RGB edge matches the lidar edge, so that looks good), but even after inverting the matrix I am not able to get a similar result in r3live. And I don't think syncing is an issue, as I am testing on a static scene.

Calibration output:

[screenshot]

r3live output (same scene as the calibration):

[screenshot]

I tried improving the intrinsics, but they come out similar every time, and even the projection error is quite low. I am really confused about where I am going wrong. Any light is appreciated.

Thank you guys

Camilochiang commented 2 years ago

Hey @aditdoshi333, my apologies, but what is your problem? I have seen that image of yours before and I don't understand what your issue is; to me it looks quite good.

aditdoshi333 commented 2 years ago

Hello @Camilochiang ,

The problem is that while calibrating I get exact edge-to-edge matching between the RGB image and the lidar, but after inverting the matrix and running r3live there is an offset in the color mapping. For example, in the r3live output image above, the edge of the pixel appears at the centre and the whole color mapping is wrong, whereas in the calibration output image all the edges are properly aligned.

Thank you

Camilochiang commented 2 years ago

Sorry, what do you mean by "the edge of the pixel"? Do you mean the edge of the point cloud? The colors look correct to me. Could you share a picture showing the real colors?

I think I see what you mean. Can you share the matrix that you got before and after "inverting"? Thanks.

EDIT: Now I think I understand what you mean. It is hard to see the mismatch in 3D.

aditdoshi333 commented 2 years ago

Hello @Camilochiang,

Sorry for the late reply. Yes, surely I can share the intrinsics and extrinsics with you.

Camera intrinsics: ost.txt

Camera extrinsic: Raw: [[0.00811628, -0.999494, -0.0307492, -0.0686258], [-0.0043803, 0.0307144, -0.999519, 0.0145342], [0.999957, 0.00824706, -0.0041288, -0.0395998], [0, 0, 0, 1]]

Inverted: [[0.00811628, -0.0043803, 0.999957, 0.04021875], [-0.999494, 0.0307144, 0.00824706, -0.0687109], [-0.0307492, -0.999519, -0.0041288, 0.01225352], [0, 0, 0, 1]]

I am highly thankful to you for your support.

Camilochiang commented 2 years ago

Everything looks coherent to me @aditdoshi333. Did you also give the translation values to R3LIVE's camera_ext_t (the last column of your inverted matrix)? R3LIVE uses the inverted matrix without the last column (so a 3x3 matrix) as camera_ext_R, plus an additional camera_ext_t array holding the last column of the inverted matrix. What I have observed is that R3LIVE works better with the camera_ext_t array set to 0, even if that is not physically true; but as I mentioned above, that is because in my case the lidar and IMU are really close. Maybe give it a try like that?

aditdoshi333 commented 2 years ago

Hello @Camilochiang,

I tried both ways: giving the last column, and setting the translation to 0. But nothing seems to work. In my case the lidar and IMU are also really close, as I am using a Livox Avia. Do you have any comments on the camera intrinsics? I am really blank on how to debug this issue further.

Thanks a lot for the help.

Camilochiang commented 2 years ago

Mmmm.

This is my configuration. Give it a try, just to check how it looks:

r3live_vio:
   image_width: 1920
   image_height: 1080
   camera_intrinsic:
      [1344.726575185332, 0.0, 939.2360719294088,
       0.0, 1336.144803263191, 536.1585308487656,
       0.0, 0.0, 1.0]
   camera_dist_coeffs: [-0.01841394411245675, 1.403767536853783, -0.004610713008191326, -0.005306367151226178, -3.079483857895877]  # k1, k2, p1, p2, k3
   # Fine extrinsic value, from camera-LiDAR calibration.
   camera_ext_R:
      [0.0129924, -0.0115002, 0.999849,
       -0.999817, -0.0141637, 0.0128291,
       0.014014, -0.999834, -0.0116821]
   # camera_ext_t from the calibration, zeroed out below (see discussion above):
   #camera_ext_t: [-0.0139635, 0.0542981, -0.0104072]
   camera_ext_t: [0, 0, 0]

aditdoshi333 commented 2 years ago

Umm, no luck...

Do you think it could be something related to FOV? In this case, the FOV of the camera is wider than that of the lidar.

Camilochiang commented 2 years ago

Mmm, it could be, but I'm not sure. You could try limiting the FOV of your camera, no? You would need to crop the image in software (without reducing the quality), recalibrate with the cropped images, and check whether it works!
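One detail if you try the cropping route: a pure crop (no rescaling) leaves the focal lengths untouched and only shifts the principal point by the crop offset, so you can pre-adjust the intrinsics and use the recalibration as a cross-check. A minimal sketch:

import numpy as np

def crop_intrinsics(K, x0, y0):
    # Adjust a 3x3 camera matrix for an image cropped at pixel (x0, y0):
    # fx and fy are unchanged; cx' = cx - x0, cy' = cy - y0.
    K2 = np.array(K, dtype=float).copy()
    K2[0, 2] -= x0
    K2[1, 2] -= y0
    return K2

The distortion coefficients stay the same under a pure crop, since they are defined relative to the optical axis, which the crop does not move.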

ly-uuu commented 2 years ago

Camera extrinsic: Raw: [[0.00811628, -0.999494, -0.0307492, -0.0686258], [-0.0043803, 0.0307144, -0.999519, 0.0145342], [0.999957, 0.00824706, -0.0041288, -0.0395998], [0, 0, 0, 1]]

Inverted: [[0.00811628, -0.0043803, 0.999957, 0.04021875], [-0.999494, 0.0307144, 0.00824706, -0.0687109], [-0.0307492, -0.999519, -0.0041288, 0.01225352], [0, 0, 0, 1]]

Why change the last column?

farhad-dalirani commented 1 year ago

Hi @aditdoshi333 @Camilochiang @seajayshore ,

I think my question in issue https://github.com/hku-mars/r3live/issues/157 is related to what you discussed. It would be great if you could take a look at it.

redheli commented 1 year ago

Hi @seajayshore, could I ask what lens you use with the Basler a2a1920-160uc camera? I am using a Livox HAP and a ZED 2i camera, and the results seem OK, but the ZED is rolling shutter, so I want to try your camera setup.

seajayshore commented 1 year ago

@redheli Sorry for the slow reply. I use an Edmund Optics Cr-series 3.5mm lens with an f/4 aperture (exact model here).

Edmund Optics has various similar lenses with a wide variety of focal lengths. Some have an adjustable aperture and others are fixed. Some have special properties (e.g., waterproof, more ruggedised, etc.). It's easy to be overwhelmed by the choices!

Whatever lens you choose, also make sure its "image circle" is larger than the sensor, or at least larger than the section of the sensor that you actually want to receive light through the lens.

I use the Cr-series because they are supposed to be extremely stable and ruggedised, meaning they can survive vibration and remain calibrated after a long time in use. (I haven't confirmed this myself, but their testing looks good to me.)

Alternatively, the C-Series is maybe better as a more "basic" version where you can also adjust the aperture.


@farhad-dalirani Sorry no one managed to reply to you, it seems. But I checked your issue and it looks like you solved it with better calibration! Well done; hopefully it's all good for you now!

fanshixiong commented 1 year ago

@seajayshore How did you solve the drifting problem? I have a drifting problem with a Livox Mid-70 LiDAR + MYNT IMU + MYNT camera. I explained it in detail in #173; it would be great if you could take a look.

farhad-dalirani commented 1 year ago

@fanshixiong Images at 30 frames per second, and a super accurate camera-lidar calibration: https://github.com/AFEICHINA/extended_lidar_camera_calib

fanshixiong commented 1 year ago

@farhad-dalirani Thanks. The IMU and lidar of my device are not mounted together, so I need to calibrate lidar-to-IMU and IMU-to-camera; the lidar and camera should not need to be calibrated directly. Is there a more accurate IMU-camera calibration method?

farhad-dalirani commented 1 year ago

@fanshixiong 1. R3LIVE depends heavily on the camera-LiDAR calibration! Huge drift happens if the calibration between those two is bad.

2. I used a Velodyne-16, which does not have an internal IMU. I used an external IMU and mounted it under the LiDAR such that the axes of the LiDAR and IMU are aligned. This approximation is sufficient.

fanshixiong commented 1 year ago

@farhad-dalirani Thank you for your reply. In my device I use a MYNT camera with its own IMU; the camera and IMU are placed under the radar. I calibrated the camera and IMU with kalibr, and calibrated the radar and IMU using their project hku-mars/LiDAR_IMU_Init. Which parameter in the config reflects the calibration between the radar and the camera that you mentioned? These are the calibration parameters of the radar and IMU:

Lidar_front_end:
   lidar_type: 1   # 1 for Livox-avia, 3 for Ouster-OS1-64
   N_SCANS: 6
   using_raw_point: 1
   point_step: 1
   lidar_imu_rotm:
      # LiDAR is mounted rotated by 90 deg
      #[1, 0, 0,
      # 0, 0, 1,
      # 0, -1, 0]
      [0.016511, -0.999700, 0.018083,
       0.057071, 0.018999, 0.998189,
       -0.998234, -0.015449, 0.057368]
   lidar_imu_tranm: 
      [0.039342, 0.077608, 0.037443]

Here are the calibration parameters of the camera and imu:

r3live_vio:
   image_width: 1280
   image_height: 720
   camera_intrinsic:
      [655.005, 0, 679.029,
       0, 656.097, 358.596,
       0, 0, 1]
   camera_dist_coeffs: [-0.238605, 0.0435143, 0.000366211, -0.00272751, 0]  # k1, k2, p1, p2, k3

   # Fine extrinsic value, from imu-to-camera calibration.
   camera_ext_R:
      [0.999998, 0.00183758, 0.000849753,
       0.00184018, -0.999994, -0.00307635,
       0.000844095, 0.00307791, -0.999995]
   camera_ext_t: [0.0993128, 0.0117891, -0.176605]

Do you mean that I only need to calibrate the lidar and camera and put that directly into the VIO parameters, and don't need to calibrate the IMU and camera? Thanks.

farhad-dalirani commented 1 year ago

Radar? Do you mean LiDAR?

camera_ext_R and camera_ext_t are the extrinsic parameters between the camera and the LiDAR. However, pay attention: camera_ext_R is from the camera to the LiDAR. If your calibration software gives you camera_ext_R from the LiDAR to the camera, you need to use its inverse.
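If in doubt about the direction, note that for a pure rotation the inverse is just the transpose, and it is worth sanity-checking that whatever ends up in camera_ext_R is still a valid rotation after hand-copying. A minimal numpy sketch:

import numpy as np

def check_rotation(R, tol=1e-3):
    # A valid rotation matrix is orthonormal (R @ R.T = I) and has
    # determinant +1; hand-copied values should pass within tolerance.
    R = np.asarray(R, dtype=float)
    ortho_err = np.abs(R @ R.T - np.eye(3)).max()
    return ortho_err < tol and abs(np.linalg.det(R) - 1.0) < tol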

fanshixiong commented 1 year ago

@farhad-dalirani Can you provide your contact information so we can discuss specific questions? My email is 443605729@qq.com.