introlab / rtabmap_ros

RTAB-Map's ROS package.
http://wiki.ros.org/rtabmap_ros
BSD 3-Clause "New" or "Revised" License

Lidar and RGBD (or stereo) SLAM/Localization #556

Open LSLubanco opened 3 years ago

LSLubanco commented 3 years ago

Dear @matlabbe, thank you and your team for providing this extremely useful package.

After reading the documentation, my understanding is that the algorithm accepts different sources of odometry (or even odometry computed externally, e.g., by the robot_localization package) and uses that information along with stereo or RGB-D data to perform SLAM. Loop closure is therefore driven primarily by the appearance of the environment together with the associated spatial information. Consequently, my understanding is that a myriad of sensors can be used in the odometry step, but for SLAM (loop closure) only visual information is used along with the odometry. Is my interpretation correct?
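For reference, a minimal roslaunch sketch of the setup described above (external odometry plus RGB-D fed to the rtabmap node); this is not from the thread, and all topic names are placeholder assumptions for a robot_localization output and a D435-style camera driver:

```xml
<!-- Sketch only: external wheel/IMU odometry + RGB-D input to rtabmap.
     Topic names are illustrative assumptions, not values from this thread. -->
<launch>
  <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen">
    <!-- Use the externally fused odometry instead of rtabmap's visual odometry -->
    <remap from="odom"            to="/odometry/filtered"/>
    <!-- RGB-D inputs from the camera driver -->
    <remap from="rgb/image"       to="/camera/color/image_raw"/>
    <remap from="rgb/camera_info" to="/camera/color/camera_info"/>
    <remap from="depth/image"     to="/camera/aligned_depth_to_color/image_raw"/>

    <param name="frame_id"        type="string" value="base_link"/>
    <param name="subscribe_depth" type="bool"   value="true"/>
  </node>
</launch>
```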

I would like to know if it is possible to use both lidar and visual information for loop closure, as well as for localization in localization-only mode. (The figure below shows the configuration I would like to have; the blue lines are connections that I know are supported, and the red line is the connection I am not sure is supported.)

[Figure: rtab_flux — desired sensor configuration, with supported connections in blue and the uncertain lidar connection in red]

Additional information 1: I tested the algorithm outdoors using visual SLAM and wheel/IMU odometry, which generated "good" results; however, when I ran localization-only mode, the results were poor (because a wrong initial location was assumed, i.e., the last location reached during SLAM). Does the algorithm initially assume a uniform distribution over the map? Moreover, I believe that if I could integrate the 2D lidar "directly" into SLAM as well as into localization-only mode, I would get better performance.

Additional information 2: Hardware used in my application: 2D lidar (360 deg.), IMU, wheel encoder, RealSense D435 camera.

All the best

matlabbe commented 3 years ago

We can feed a 2D lidar to the rtabmap node: set the subscribe_scan parameter to true and remap the scan topic. There is an example here. In localization mode, once the robot is localized visually (through a global visual loop closure), it can use the 2D lidar for proximity detection (independently of visual information) based on the current odometry. See Section 3.2 of this paper: https://introlab.3it.usherbrooke.ca/mediawiki-introlab/images/7/7a/Labbe18JFR_preprint.pdf
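A minimal launch sketch of this configuration, assuming the lidar publishes on /scan; the registration and localization parameter values below are illustrative choices based on the rtabmap documentation, not values stated in this thread:

```xml
<!-- Sketch only: add the 2D lidar to the rtabmap node and switch to
     localization-only mode. Values are assumptions for illustration. -->
<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen">
  <!-- Subscribe to the 2D lidar and remap to the actual scan topic -->
  <param name="subscribe_scan" type="bool" value="true"/>
  <remap from="scan" to="/scan"/>

  <!-- Refine links with the scans and enable proximity detection by space
       (Section 3.2 of the cited paper) -->
  <param name="Reg/Strategy"              type="string" value="1"/>
  <param name="RGBD/NeighborLinkRefining" type="string" value="true"/>
  <param name="RGBD/ProximityBySpace"     type="string" value="true"/>

  <!-- Localization-only mode: keep the map fixed and load all nodes at start -->
  <param name="Mem/IncrementalMemory"  type="string" value="false"/>
  <param name="Mem/InitWMWithAllNodes" type="string" value="true"/>
</node>
```

With this kind of setup, global loop closures are still detected visually, while the lidar scans refine the resulting links and provide proximity detection once the robot has been localized.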

cheers, Mathieu