introlab / rtabmap_ros

RTAB-Map's ROS package.
http://wiki.ros.org/rtabmap_ros
BSD 3-Clause "New" or "Revised" License

Using RTABmap as a frontend without map #723

Open stevemartinov opened 2 years ago

stevemartinov commented 2 years ago

Hi,

I am using OpenVSLAM to create a map and then I go into localization mode. The problem is that my odometry drifts quite a bit and localization does not occur often, and I need my robot to be as accurate as possible between localizations (so it does not drift too much before the localization/correction happens).

Luckily, I do have wheel odometry, LiDAR and an RGB-D camera on my robot. What is the best way to use those sensors to make the odometry as accurate as possible?

  1. If I use only wheel odometry and LiDAR, how reliable is it? Will the odometry go crazy if the lidar, for example, does not detect anything (a big open space)?
  2. If I use all sensors, can I stop creating a map and be in odometry mode? Can RTABmap use RGBD as Visual Odometry?
matlabbe commented 2 years ago

Combining wheel odometry with LiDAR (with rtabmap's icp_odometry) can be pretty robust in most indoor environments (when the LiDAR sees walls and enough complex geometry in its range). In open space, it depends on the lidar odometry approach, but yes, if it has a strong bias on lidar data, it may get lost. With rtabmap, we can feed it wheel odometry and enable RGBD/NeighborLinkRefining to refine the odometry with the lidar. If the lidar cannot see anything, rtabmap still uses the wheel odometry for new poses added to the graph (the poses are just not refined, but they can still be used). This approach assumes you are using rtabmap as the back-end.
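As a rough sketch of that setup (the wheel odometry and scan topic names below are placeholders, not taken from your bag):

roslaunch rtabmap_ros rtabmap.launch \
   visual_odometry:=false \
   odom_topic:=/wheel_odom \
   subscribe_scan:=true \
   scan_topic:=/scan \
   args:="--RGBD/NeighborLinkRefining true --Reg/Strategy 1 --Reg/Force3DoF true"

With visual_odometry:=false and odom_topic set, rtabmap takes the external wheel odometry as-is, and RGBD/NeighborLinkRefining refines the neighbor links with the laser scans (Reg/Strategy 1 selects ICP registration).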

Another approach would be to use robot_localization to fuse the wheel odometry and the icp_odometry output. Set Odom/ResetCountdown=1, publish_tf=false and publish_null_when_lost=false. icp_odometry will then publish the odometry topic only when a twist can be computed; in robot_localization you would fuse only the twist.
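A minimal sketch of that configuration (node, topic and frame names are illustrative, not from your setup), with icp_odometry set up as described in a launch file:

<node pkg="rtabmap_ros" type="icp_odometry" name="icp_odometry" output="screen">
   <remap from="scan" to="/scan_front_filtered"/>
   <param name="publish_tf"             type="bool"   value="false"/>
   <param name="publish_null_when_lost" type="bool"   value="false"/>
   <param name="Odom/ResetCountdown"    type="string" value="1"/>
</node>

and, in the ekf_localization_node parameters, the icp_odometry topic fused with only the velocity fields (vx, vy, vyaw) enabled:

odom1: /odom                          # icp_odometry output topic (assumed default name)
odom1_config: [false, false, false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az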

To be only in odometry mode, just start the icp_odometry, rgbd_odometry or stereo_odometry node. If you use the rtabmap node, it can use RGB-D input. You may look at this page to see some examples of how rtabmap can use different odometry approaches and sensors.
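For example, for visual odometry from an RGB-D camera alone (the camera topic names below are generic placeholders, adjust them to your driver):

rosrun rtabmap_ros rgbd_odometry \
   rgb/image:=/camera/rgb/image_rect_color \
   depth/image:=/camera/depth_registered/image_raw \
   rgb/camera_info:=/camera/rgb/camera_info \
   _frame_id:=base_link

This publishes an odometry topic and TF, without building any map.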

stevemartinov commented 2 years ago

So there is no way to enable robust odometry prediction using LiDAR, wheel odometry and a stereo camera? The downside of using robot_localization is that it requires covariances, and I am not sure the ICP odometry has accurate covariance.

stevemartinov commented 2 years ago

Also, can you have a look at my bag file? For some reason, icp_odometry from rtabmap does not seem to work. The scan topic is /scan_front_filtered.

https://drive.google.com/file/d/1tAbHbV0-stmV9cENFiwF2kvGdUWleKqU/view?usp=sharing

matlabbe commented 2 years ago

Well, there is a node called rgbdicp_odometry, which uses the camera and the lidar, possibly with wheel odometry. However, I don't think it does what you want, as it mainly does this pipeline: wheel odometry -> visual odometry -> icp odometry. In case visual registration fails, it still does icp registration with the wheel odometry as the guess instead of the visual one. In case icp registration fails, the whole output fails even if the visual registration worked (we would have to modify here to handle the case where the new t is null while the previous t was okay, reverting back to the previous t).
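For completeness, a sketch of how rgbdicp_odometry could be started with wheel odometry as the guess (the camera topic names and the odom_wheel frame below are placeholders):

rosrun rtabmap_ros rgbdicp_odometry \
   rgb/image:=/camera/rgb/image_rect_color \
   depth/image:=/camera/depth_registered/image_raw \
   rgb/camera_info:=/camera/rgb/camera_info \
   scan:=/scan_front_filtered \
   _frame_id:=base_link \
   _guess_frame_id:=odom_wheel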

I tried your bag with those commands:

roslaunch rtabmap_ros rtabmap.launch \
   icp_odometry:=true \
   scan_topic:=/scan_front_filtered \
   subscribe_rgb:=false \
   depth:=false \
   frame_id:=lidar_front \
   args:="-d --Reg/Strategy 1 --Reg/Force3DoF true --Icp/MaxTranslation 0.5" \
   odom_args:="--Icp/CorrespondenceRatio 0.01"  \
   use_sim_time:=true \
   subscribe_scan:=true \
   imu_topic:=not_used

cd ~/Downloads/dbs_rgbd_wheel
rosbag play --clock  *

I had to explicitly set imu_topic to not_used because the imu topic inside the bag is called /imu/data (the same name rtabmap uses by default), but its frame_id is not set and no static transforms were recorded in the bag.

[Screenshot from 2022-03-06 19-54-39]

stevemartinov commented 2 years ago

Hi,

I added a stereo camera and did a mapping, but in the rtabmap viewer the graph view is just white with the trajectory. Why is that? If I remove the stereo camera, then I see the occupancy map.

matlabbe commented 2 years ago

Is rtabmap still subscribed to the lidar topic? Here is an example based on the previous one, with a stereo camera:

roslaunch rtabmap_ros rtabmap.launch \
   icp_odometry:=true \
   scan_topic:=/scan_front_filtered \
   left_image_topic:=/stereo/left/image_rect_color \
   right_image_topic:=/stereo/right/image_rect \
   left_camera_info_topic:=/stereo/left/camera_info \
   right_camera_info_topic:=/stereo/right/camera_info \
   stereo:=true \
   rgbd_sync:=true \
   approx_rgbd_sync:=false \
   subscribe_rgbd:=true \
   frame_id:=lidar_front \
   args:="-d --Reg/Strategy 1 --Reg/Force3DoF true --Icp/MaxTranslation 0.5" \
   odom_args:="--Icp/CorrespondenceRatio 0.01"  \
   use_sim_time:=true \
   subscribe_scan:=true \
   imu_topic:=not_used

The frame_id would be changed to the base frame of the robot. Then make sure your stereo camera is linked in the TF tree to the base frame. This setup is similar to this one, but with a stereo camera instead of an RGB-D camera.

[Screenshot from 2022-04-10 18-08-42]
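If the stereo camera is not already connected to the base frame in TF, a static transform can be published while testing, for example (the translation/rotation values and the stereo_camera_link frame name are placeholders for your actual mounting):

rosrun tf2_ros static_transform_publisher 0 0 0.3 0 0 0 base_link stereo_camera_link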