RobustFieldAutonomyLab / LeGO-LOAM

LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain
BSD 3-Clause "New" or "Revised" License

How to align the IMU to Lidar? && Problems with the KITTI dataset #153

Closed SakuraMemoryKnight closed 4 years ago

SakuraMemoryKnight commented 4 years ago

I was testing sequence 00 of the KITTI dataset. If I don't use the IMU data from the dataset, I end up with a pretty good result, with a little bit of bias, but when I add the IMU data, the result is pretty bad.

The calibration files of the KITTI dataset include an imu_to_velo calibration, which contains a 3x3 rotation matrix and a 3x1 translation vector. How can I feed these into the program so that the IMU and lidar are aligned correctly?
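
One way to apply those extrinsics is to rotate the IMU readings into the velodyne frame before they reach LeGO-LOAM. A minimal sketch, assuming Eigen and that R_imu2velo holds the 3x3 rotation from calib_imu_to_velo.txt (the translation only matters for lever-arm compensation and is ignored here):

```cpp
#include <Eigen/Dense>
#include <Eigen/Geometry>
#include <sensor_msgs/Imu.h>

Eigen::Matrix3d R_imu2velo;  // fill from calib_imu_to_velo.txt

sensor_msgs::Imu imuToVeloFrame(const sensor_msgs::Imu& in)
{
    sensor_msgs::Imu out = in;

    // Measured vectors live in the IMU frame; re-express them in the lidar frame.
    Eigen::Vector3d acc(in.linear_acceleration.x,
                        in.linear_acceleration.y,
                        in.linear_acceleration.z);
    Eigen::Vector3d gyr(in.angular_velocity.x,
                        in.angular_velocity.y,
                        in.angular_velocity.z);
    acc = R_imu2velo * acc;
    gyr = R_imu2velo * gyr;
    out.linear_acceleration.x = acc.x();
    out.linear_acceleration.y = acc.y();
    out.linear_acceleration.z = acc.z();
    out.angular_velocity.x = gyr.x();
    out.angular_velocity.y = gyr.y();
    out.angular_velocity.z = gyr.z();

    // Orientation: world-from-velo = world-from-imu * imu-from-velo (= R^T).
    Eigen::Quaterniond q_wi(in.orientation.w, in.orientation.x,
                            in.orientation.y, in.orientation.z);
    Eigen::Matrix3d R_velo2imu = R_imu2velo.transpose();
    Eigen::Quaterniond q_wv = q_wi * Eigen::Quaterniond(R_velo2imu);
    out.orientation.w = q_wv.w();
    out.orientation.x = q_wv.x();
    out.orientation.y = q_wv.y();
    out.orientation.z = q_wv.z();
    return out;
}
```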

One more thing: the README says “our current implementation for range image projection is only suitable for sensors that have evenly distributed channels. If the point cloud is not projected properly, you will lose many points and performance.” Doesn't the 64-line lidar used in the KITTI dataset have evenly distributed channels? If not, will that affect the algorithm's loop-closure detection? When I tested on the KITTI dataset, loop closure never took effect.

shadowdouble0 commented 4 years ago

Hi, I have the same question. Did you solve it?

TixiaoShan commented 4 years ago

> Doesn't the 64-line lidar used in the KITTI dataset have evenly distributed channels?

More information about that lidar can be found here: https://velodynelidar.com/products/hdl-64e/

I believe this is the beam distribution for this lidar: https://github.com/ros-drivers/velodyne/blob/master/velodyne_pointcloud/tests/angles-calibrated.yaml
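
Since those beams are not evenly spaced, the uniform projection in imageProjection.cpp (rowIdn = (verticalAngle + ang_bottom) / ang_res_y) can assign points to the wrong ring. A minimal sketch of a nearest-angle ring lookup instead, assuming beamAngles is sorted ascending (in degrees) and filled from a calibration file like the one above:

```cpp
#include <algorithm>
#include <vector>

// beamAngles: per-ring vertical angles in degrees, sorted ascending.
int rowFromVerticalAngle(double angleDeg, const std::vector<double>& beamAngles)
{
    auto it = std::lower_bound(beamAngles.begin(), beamAngles.end(), angleDeg);
    if (it == beamAngles.begin()) return 0;
    if (it == beamAngles.end())   return static_cast<int>(beamAngles.size()) - 1;

    // Pick whichever calibrated beam is closer to the measured angle.
    const double hi = *it;
    const double lo = *(it - 1);
    const int idx = static_cast<int>(it - beamAngles.begin());
    return (angleDeg - lo <= hi - angleDeg) ? idx - 1 : idx;
}
```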

TixiaoShan commented 4 years ago

For IMU alignment: I didn't use the IMU with the KITTI dataset, so sorry, I can't help you with that.

SakuraMemoryKnight commented 4 years ago

Thanks for your reply, @TixiaoShan! I'll try to write a customized range image projection implementation.

SakuraMemoryKnight commented 4 years ago

About the IMU: how did you align the IMU and lidar correctly with your own dataset? Did the IMU and lidar positions almost coincide when you made your own dataset?

TixiaoShan commented 4 years ago

@zcg3648806

> Did the IMU and lidar positions almost coincide when you made your own dataset?

Yes, I taped the IMU right on top of the lidar.

SakuraMemoryKnight commented 4 years ago

Thanks for your reply, @TixiaoShan! As you said, I think one of the reasons the IMU of the KITTI dataset didn't perform well is that there is some distance between the IMU and the lidar; that's my guess.

But I don't think that's the main reason: when I run the KITTI dataset, the map construction jumps, and I don't know why yet. I'm trying to figure it out.

Again on the IMU: when you made your own dataset, were the xyz axis directions of the IMU aligned with the lidar?

TixiaoShan commented 4 years ago

> Again on the IMU: when you made your own dataset, were the xyz axis directions of the IMU aligned with the lidar?

Yes, my mounting of the IMU strictly follows the ROS standard: https://www.ros.org/reps/rep-0103.html https://www.ros.org/reps/rep-0105.html

14212094 commented 4 years ago

@zcg3648806 Hello! May I ask where I can find the IMU data for sequence 00 of the KITTI dataset? Thank you~

SakuraMemoryKnight commented 4 years ago

@14212094 The sequences come from the odometry dataset, and the odometry dataset is just a subset of the raw data, so if you want the IMU data you can download the raw data directly. The raw data for each sequence is listed below (the numbers are the frame ranges taken from each raw drive):

00: 2011_10_03_drive_0027 000000 004540
01: 2011_10_03_drive_0042 000000 001100
02: 2011_10_03_drive_0034 000000 004660
03: 2011_09_26_drive_0067 000000 000800
04: 2011_09_30_drive_0016 000000 000270
05: 2011_09_30_drive_0018 000000 002760
06: 2011_09_30_drive_0020 000000 001100
07: 2011_09_30_drive_0027 000000 001100
08: 2011_09_30_drive_0028 001100 005170
09: 2011_09_30_drive_0033 000000 001590
10: 2011_09_30_drive_0034 000000 001200
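
If you then want to play the raw-data IMU into LeGO-LOAM, a minimal sketch that converts one oxts line into a sensor_msgs/Imu message; the field indices are my reading of the raw data's oxts/dataformat.txt (roll/pitch/yaw at 3-5, ax/ay/az at 11-13, wx/wy/wz at 17-19), so double-check them against your copy:

```cpp
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <tf/transform_datatypes.h>

#include <iterator>
#include <sstream>
#include <string>
#include <vector>

sensor_msgs::Imu oxtsLineToImu(const std::string& line, const ros::Time& stamp)
{
    std::istringstream iss(line);
    std::vector<double> v{std::istream_iterator<double>(iss),
                          std::istream_iterator<double>()};

    sensor_msgs::Imu imu;
    imu.header.stamp = stamp;
    imu.header.frame_id = "imu_link";  // hypothetical frame name
    imu.orientation =
        tf::createQuaternionMsgFromRollPitchYaw(v[3], v[4], v[5]);
    imu.angular_velocity.x    = v[17];  // wx [rad/s]
    imu.angular_velocity.y    = v[18];  // wy
    imu.angular_velocity.z    = v[19];  // wz
    imu.linear_acceleration.x = v[11];  // ax [m/s^2]
    imu.linear_acceleration.y = v[12];  // ay
    imu.linear_acceleration.z = v[13];  // az
    return imu;
}
```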

14212094 commented 4 years ago

@zcg3648806 OK! Thank you so much~

gkharish commented 4 years ago

Hello @zcg3648806 and @TixiaoShan, I am also trying to understand the impact (or benefit) of the IMU in this algorithm. I simply ran the algorithm with the developer's sample data (https://drive.google.com/open?id=1rNmNcpkVE2NhYKftosK1wieVnBl569AG) in two scenarios:

  1. Velodyne only: rosbag play --clock *.bag --topics /velodyne_points
  2. IMU with Velodyne: rosbag play --clock *.bag --topics /velodyne_points /imu/data

I plotted the trajectories (/key_pose_origin in the camera_init frame) for the two cases. Here is the result. I was not expecting so much drift between the two trajectories, because the IMU is aligned with the lidar (see Y-axis). Am I missing something?

NOTE: to get a tf between base_link and velodyne, I am using a static transform publisher in the launch file:
<node pkg="tf" type="static_transform_publisher" name="base_to_velodyne" args="0.070 0.030 0.364 0 0 0 /base_link /velodyne 50" />

When I use my own setup, similar to the developer's (IMU data from an onboard Pixhawk mounted right on top of a VLP-16), adding the IMU only degrades the performance. Thanks for the help!

jmachuca77 commented 4 years ago

Hi @gkharish, I am trying to do something similar right now. I am running a Quanergy M8 lidar, and running this package without an IMU is not giving me very good results: it starts OK but then gets completely lost and goes in circles (video). I am hoping to add the IMU from the Pixhawk to try to fix this. What software are you running on the Pixhawk, ArduPilot or PX4? I assume you are running MAVROS? And at what rate are you publishing the IMU information? I am trying to publish at 100 Hz and feed that into this package via MAVROS.

TixiaoShan commented 4 years ago

@gkharish @jmachuca77 Even though the IMU and the Velodyne are not 100% aligned in the provided dataset, they are placed close enough. Also, make sure the IMU data follows the ROS standard frame: with x being forward, rotation around it is roll; with y being left, rotation around it is pitch; with z being up, rotation around it is yaw.
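
A quick way to verify that convention is to print roll/pitch/yaw from the IMU topic and check the signs while tilting the sensor by hand; a throwaway sketch, assuming the topic is /imu/data:

```cpp
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <tf/transform_datatypes.h>

void imuCallback(const sensor_msgs::Imu::ConstPtr& msg)
{
    // Convert the orientation quaternion to roll/pitch/yaw (REP-103 order).
    tf::Quaternion q;
    tf::quaternionMsgToTF(msg->orientation, q);
    double roll, pitch, yaw;
    tf::Matrix3x3(q).getRPY(roll, pitch, yaw);
    ROS_INFO("roll %6.1f  pitch %6.1f  yaw %6.1f [deg]",
             roll * 180.0 / M_PI, pitch * 180.0 / M_PI, yaw * 180.0 / M_PI);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "imu_rpy_check");
    ros::NodeHandle nh;
    ros::Subscriber sub = nh.subscribe("/imu/data", 10, imuCallback);
    ros::spin();
    return 0;
}
```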

gkharish commented 4 years ago

@jmachuca77, sorry for the late response.

  1. I couldn't access your video; the link says the video is unavailable.
  2. I use a Pixhawk and PX4 (not ArduPilot), with MAVROS to get the data in ROS format.
  3. The IMU data from MAVROS comes at 100 Hz and in the ROS standard frame (ENU), as described by TixiaoShan.

stale[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

chubukeji commented 2 years ago

@zcg3648806 Hello, the raw drive for sequence 03 (2011_09_26_drive_0067) can't be found on the official website. Do you have this dataset? Or do you know why it can't be found? My email is 2661076951@qq.com