ethz-asl / segmap

A map representation based on 3D segments
BSD 3-Clause "New" or "Revised" License

Segmap on streaming LIDAR point cloud topics. #86

Open JustWon opened 6 years ago

JustWon commented 6 years ago

Currently, I am trying to build a map from a ROS node publishing LIDAR point cloud topics. I followed some related issues:

https://github.com/ethz-asl/segmap/issues/81 https://github.com/ethz-asl/segmap/issues/69

I changed:

- "use_odometry_information" to false
- "assembled_cloud_sub_topic" to /ns1/velodyne_points (in my case)
- "sensor_frame" to left_velodyne
- "use_odom_factors" to false
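In .yaml terms these are flat keys, roughly like the sketch below (key names as used in this thread; the exact nesting inside the segmapper launch .yaml files may differ):

```yaml
# Sketch of the changed keys (names as used in this thread; the exact
# nesting inside the segmapper launch .yaml files may differ).
use_odometry_information: false
assembled_cloud_sub_topic: "/ns1/velodyne_points"  # my LIDAR topic
sensor_frame: "left_velodyne"
use_odom_factors: false
```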

After running the launch file, I got the following error.

[screenshot: error output]

After analyzing the code, I found that a transform involving the odom_frame is required because of the tf_listener_.waitForTransform call in the LaserSlamWorker::scanCallback function.

So I tried to publish the static transform by adding a static_transform_publisher node to the launch file, along the lines of the sketch below.
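A minimal sketch of such a node (the frame names odom and left_velodyne, the identity offset, and the node name are assumptions; the original snippet did not survive in this post):

```xml
<!-- Minimal sketch: frame names (odom, left_velodyne), the identity
     offset, and the node name are assumptions.
     args: x y z qx qy qz qw parent_frame child_frame period_in_ms -->
<node pkg="tf" type="static_transform_publisher" name="odom_to_velodyne"
      args="0 0 0 0 0 0 1 odom left_velodyne 100"/>
```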

But I still encountered the same error.

The following figure shows the rqt_graph when doing so. [screenshot: rqt_graph]

Have you tried Segmap on streaming LIDAR topics?

rdube commented 6 years ago

@JustWon thanks for your interest! I never had the chance to test with a bag which did not contain odometry information. I believe that the best solution would be to adapt LaserSlamWorker::scanCallback so that tf_listener_.waitForTransform is not called when use_odometry_information is false.
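A minimal sketch of that guard, with member and parameter names assumed rather than copied from the actual laser_slam_ros code:

```cpp
// Sketch only: params_, tf_listener_, and kTimeoutSeconds are assumed names.
void LaserSlamWorker::scanCallback(const sensor_msgs::PointCloud2& cloud_msg_in) {
  tf::StampedTransform tf_transform;
  if (params_.use_odometry_information) {
    // Original behavior: block until the odometry transform is on tf.
    tf_listener_.waitForTransform(params_.odom_frame, params_.sensor_frame,
                                  cloud_msg_in.header.stamp,
                                  ros::Duration(kTimeoutSeconds));
    tf_listener_.lookupTransform(params_.odom_frame, params_.sensor_frame,
                                 cloud_msg_in.header.stamp, tf_transform);
  } else {
    // LiDAR-only mode: no odometry tf is available, so start from identity
    // and let the ICP-based odometry estimate motion from the scans alone.
    tf_transform.setIdentity();
    tf_transform.stamp_ = cloud_msg_in.header.stamp;
    tf_transform.frame_id_ = params_.odom_frame;
    tf_transform.child_frame_id_ = params_.sensor_frame;
  }
  // ... continue scan processing with tf_transform ...
}
```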

Does that make sense? Would you have time to help us with this? I am happy to support and review a PR if you can implement this modification.

JustWon commented 6 years ago

I slightly modified the code to force it to consider LIDAR input only.

[screenshot: code modification]

It works, but the estimated trajectory seems weird. [screenshot: estimated trajectory]

It looks like the trajectory of a drone, and it seems that most of the point cloud or segments are eliminated...

I guess I should tune some parameters of the input filters.

rdube commented 6 years ago

Haha, yes, that looks wrong indeed! Keep in mind that the core of this library is the segment-based localization functionality. The ICP-based LiDAR odometry is only offered as a reference / starting point for the demonstration. Also, in general I would suggest supplementing the LiDAR with an IMU whenever possible.

In order to get better LiDAR-only odometry results, you should definitely make sure that the laser_slam backend runs in real time (and can process every scan). If that isn't the case, it will indeed have a hard time converging to a good solution. For this you can experiment with the different laser_slam parameters, e.g. https://github.com/ethz-asl/segmap/blob/master/segmapper/launch/kitti/kitti_loop_closure.yaml#L106

and also, importantly, the input filters: https://github.com/ethz-asl/segmap/blob/master/segmapper/launch/kitti/input_filters_outdoor.yaml

and ICP configurations: https://github.com/ethz-asl/segmap/blob/master/segmapper/launch/kitti/icp_dynamic_outdoor.yaml
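For the input filters specifically, that file uses libpointmatcher's DataPointsFilter chain; an illustrative (untuned) starting point might look like:

```yaml
# Illustrative filter chain in the style of input_filters_outdoor.yaml
# (libpointmatcher DataPointsFilters); the values are untuned guesses.
- MinDistDataPointsFilter:
    minDist: 3.0     # drop close returns (vehicle body, sensor mount)
- MaxDistDataPointsFilter:
    maxDist: 70.0    # discard sparse long-range points
- RandomSamplingDataPointsFilter:
    prob: 0.5        # subsample so the backend can keep up in real time
```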

Hope that helps!

rdube commented 6 years ago

Also, first try visualizing the local map point cloud /segmapper/local_map. It will not help to look at the segments until this local map is clean.
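A quick sanity check with standard ROS tooling:

```sh
# Verify the local map is being published, and at what rate.
rostopic hz /segmapper/local_map

# Then inspect it in rviz with a PointCloud2 display on that topic.
rosrun rviz rviz
```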

JustWon commented 6 years ago

I looked into the topics from the 2011_10_03_drive_27.bag file via rqt_bag, and I visualized the x, y, z values from a translation in the tf topic (I think it is from the IMU sensor since the child_frame_id indicates imu). [screenshot: plot of the translation values]

I guess the tf_listener in LaserSlamWorker might subscribe to this tf topic from the .bag file. What do those values mean?

I thought that the values from an IMU sensor represent relative transformations between consecutive timestamps, but these look like poses in the world frame. Which one is right? Or do they mean something else?

danieldugas commented 6 years ago

This curve likely shows the position of the approximate tf transform integrated from instantaneous IMU values. How exactly it is generated depends on the IMU sensor driver.
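For intuition, here is a toy dead-reckoning sketch (hypothetical names, not the actual driver code) of how instantaneous gyro/accelerometer samples can be integrated into a world-frame pose; this is why the plotted values grow like absolute positions rather than staying near zero like relative increments would:

```cpp
#include <Eigen/Geometry>

// Toy example only: real drivers also estimate biases, fuse other sensors,
// and handle noise; all names here are hypothetical.
struct ImuState {
  Eigen::Vector3d p = Eigen::Vector3d::Zero();            // world-frame position
  Eigen::Vector3d v = Eigen::Vector3d::Zero();            // world-frame velocity
  Eigen::Quaterniond q = Eigen::Quaterniond::Identity();  // world <- imu rotation
};

void integrate(ImuState& s, const Eigen::Vector3d& gyro,
               const Eigen::Vector3d& accel, double dt) {
  const Eigen::Vector3d g(0.0, 0.0, -9.81);
  // Propagate orientation with the gyro rates (small-angle quaternion).
  Eigen::Quaterniond dq;
  dq.coeffs() << 0.5 * gyro * dt, 1.0;  // coeffs order: (x, y, z, w)
  s.q = (s.q * dq.normalized()).normalized();
  // Rotate specific force into the world frame, remove gravity, integrate.
  const Eigen::Vector3d a_world = s.q * accel + g;
  s.p += s.v * dt + 0.5 * a_world * dt * dt;
  s.v += a_world * dt;
}
```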

JustWon commented 6 years ago

I tested Segmap on the KAIST Urban Dataset using LIDAR data only.

[screenshot: mapping result on the KAIST Urban Dataset]

It works but is not consistent, I guess due to ignoring the odometry (from IMU or GPS) data?

rdube commented 6 years ago

@JustWon nice! Thanks for sharing this! I believe that more consistent results could be achieved by integrating the IMU and/or tuning the LiDAR odometry part (or considering other techniques, like LOAM, for example). Then it might help to adjust the place recognition parametrization.

Are you planning on integrating the IMU measurements? What are your next steps?

JustWon commented 6 years ago

@rdube Thanks for your reply. Yes, I am planning on integrating the IMU measurements, but first I guess I should generate the odometry information in the global coordinate frame from the relative IMU measurements, as sketched below.
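One simple way to do that is to chain the relative increments into world-frame poses (a sketch with hypothetical names):

```cpp
#include <vector>
#include <Eigen/Geometry>

// Hypothetical helper: compose relative pose increments T_{k-1 -> k}
// (e.g. integrated from the IMU) into world-frame poses T_{world -> k}.
std::vector<Eigen::Isometry3d> toWorldFrame(
    const std::vector<Eigen::Isometry3d>& relative_increments) {
  std::vector<Eigen::Isometry3d> trajectory;
  Eigen::Isometry3d T_world = Eigen::Isometry3d::Identity();
  for (const auto& T_rel : relative_increments) {
    T_world = T_world * T_rel;  // T_w_k = T_w_{k-1} * T_{k-1 -> k}
    trajectory.push_back(T_world);
  }
  return trajectory;
}
```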

And I hope to utilize semantic information when detecting loop closures. Currently I'm attending CVPR 2018 in Salt Lake City, and I am interested in the "Semantic Visual Localization" paper, which will be presented in the afternoon. Interestingly, the paper is from ETH as well.

rdube commented 6 years ago

@JustWon yes, this is interesting work!

TuskAW commented 5 years ago

@rdube #48 talks about changing SegMap's SLAM backend. Is that possible with the latest baseline? I was looking to replace ICP lidar odometry with LOAM's odometry. laser_slam_worker subscribes to the tf published in the kitti bag files; although it may sound ridiculous, will it suffice to subscribe to the tf published by LOAM? I'd appreciate any insight you may be able to provide.

gzyuaner commented 5 years ago

> I tested Segmap on the KAIST Urban Dataset using LIDAR data only.
>
> [screenshot: mapping result on the KAIST Urban Dataset]
>
> It works but is not consistent, I guess due to ignoring the odometry (from IMU or GPS) data?

@JustWon Could you tell me how you used point cloud data only to get tf? Thanks a lot.