JustWon opened this issue 6 years ago
@JustWon thanks for your interest! I never had the chance to test with a bag which did not contain odometry information. I believe that the best solution would be to adapt LaserSlamWorker::scanCallback so that tf_listener_.waitForTransform is not called when use_odometry_information is false.
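A minimal sketch of what that guard could look like (the member and parameter names here, e.g. params_.use_odometry_information and params_.odom_frame, are assumptions for illustration, not necessarily the actual segmap code):

```cpp
// Hypothetical sketch of the guard in LaserSlamWorker::scanCallback.
// Member and parameter names are assumptions for illustration only.
void LaserSlamWorker::scanCallback(
    const sensor_msgs::PointCloud2& cloud_msg_in) {
  if (params_.use_odometry_information) {
    // Only block on the odometry transform when odometry is expected to exist.
    tf_listener_.waitForTransform(params_.odom_frame, params_.sensor_frame,
                                  cloud_msg_in.header.stamp,
                                  ros::Duration(kTimeout_s));
    // ... look up the transform and use it as an odometry prior ...
  } else {
    // No odometry in the bag: rely on the ICP-based LiDAR odometry alone,
    // without querying tf at all.
    // ... process the scan directly ...
  }
}
```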
Does that make sense? Would you have time to help us with this? I am happy to support and review a PR if you can implement this modification.
I slightly modified the code to force it to consider LiDAR input only.
It works but the estimated trajectory seems weird.
It looks like the trajectory of a drone, and it seems that most of the point cloud or segments are eliminated...
I guess I should tune some parameters of the input filters.
haha yes that looks wrong indeed! Keep in mind that the core of this library is the segment-based localization functionality. The ICP-based LiDAR odometry is only offered as a reference / starting point for the demonstration. Also, in general I would suggest supplementing the LiDAR with an IMU whenever possible.
In order to get better LiDAR-only odometry results, you should definitely make sure that the laser_slam backend runs in real time (and can process every scan). If that isn't the case, it will indeed have a hard time converging to a good solution. For this you can experiment with the different laser_slam parameters, e.g. https://github.com/ethz-asl/segmap/blob/master/segmapper/launch/kitti/kitti_loop_closure.yaml#L106
and also importantly the input filters: https://github.com/ethz-asl/segmap/blob/master/segmapper/launch/kitti/input_filters_outdoor.yaml
and ICP configurations: https://github.com/ethz-asl/segmap/blob/master/segmapper/launch/kitti/icp_dynamic_outdoor.yaml
Hope that helps!
Also, first try visualizing the local map point cloud on /segmapper/local_map.
It will not help to look at the segments until this local map is clean.
I looked into the topics from the 2011_10_03_drive_27.bag file via rqt_bag, and I visualized the x, y, z values of a translation in the tf topic (I think it comes from the IMU sensor, since the child_frame_id indicates imu).
I guess the tf_listener in LaserSlamWorker might subscribe to this tf topic from the .bag file. What do those values mean?
I thought that values from an IMU sensor represent relative transformations between consecutive time stamps, but these look like poses in the world frame. Which one is right, or do they mean something else?
This curve likely shows the position of the approximate tf transform integrated from instantaneous IMU values. How exactly it is generated depends on the IMU sensor driver.
I tested SegMap on the KAIST Urban Dataset using LiDAR data only.
It works but is not consistent, I guess due to ignoring the odometry (from IMU or GPS) data?
@JustWon nice! Thanks for sharing this! I believe that more consistent results could be achievable by integrating IMU and / or tuning the LiDAR odometry part (or also considering other techniques like LOAM for example). Then it might help to adjust the place recognition parametrization.
Are you planning on integrating the IMU measurements? What are your next steps?
@rdube Thanks for your reply. Yes, I am planning on integrating the IMU measurements, but first I guess I should generate the odometry information in the global coordinate frame from the relative IMU measurements.
I also hope to utilize semantic information when detecting loop closures. Currently, I'm attending CVPR 2018 in Salt Lake City, and I am interested in the "Semantic Visual Localization" paper which will be presented in the afternoon. Interestingly, the paper is from ETH as well.
@JustWon yes this is an interesting work!
@rdube #48 talks about changing SegMap's SLAM backend. Is that possible with the latest baseline? I was looking to replace ICP lidar odometry with LOAM's odometry. laser_slam_worker subscribes to the tf published in the kitti bag files; although it may sound ridiculous, will it suffice to subscribe to the tf published by LOAM? I'd appreciate any insight you may be able to provide.
> I tested SegMap on the KAIST Urban Dataset using LiDAR data only.
> It works but is not consistent, I guess due to ignoring the odometry (from IMU or GPS) data?
@JustWon Could you tell me how you used point cloud data only to get tf? Thanks a lot.
Currently, I am trying to build a map from the input of a ROS node publishing LiDAR point cloud topics. I followed some issues regarding this:
https://github.com/ethz-asl/segmap/issues/81 https://github.com/ethz-asl/segmap/issues/69
I changed:
- "use_odometry_information" to false
- "assembled_cloud_sub_topic" to /ns1/velodyne_points (in my case)
- "sensor_frame" to left_velodyne
- "use_odom_factors" to false
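Expressed as a parameter-file fragment, those changes would look roughly like this (the key names come from the list above, but their exact nesting inside the segmapper yaml files may differ):

```yaml
# Sketch only: the nesting of these keys in the segmapper configuration
# may differ; only the names and values are taken from the list above.
use_odometry_information: false
assembled_cloud_sub_topic: "/ns1/velodyne_points"  # topic of my LiDAR node
sensor_frame: "left_velodyne"
use_odom_factors: false
```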
After running the launch file, I got the following error.
After analyzing the code, I found out that a topic message related to the odom_frame is necessary because of the tf_listener_.waitForTransform call in the LaserSlamWorker::scanCallback function.
So I tried to publish the static transform by adding the following code in a launch file.
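A static transform of this kind is typically published with tf's static_transform_publisher node; the frame names below are illustrative guesses matching the parameters above, not necessarily the exact ones I used:

```xml
<!-- Illustrative only: the frame names must match the segmapper
     odom_frame and sensor_frame parameters actually in use. -->
<node pkg="tf" type="static_transform_publisher" name="odom_to_velodyne"
      args="0 0 0 0 0 0 odom left_velodyne 100" />
```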
but I still encountered the same error.
The following figure shows the rqt_graph when doing so.
Have you tried SegMap with streaming LiDAR topics?