introlab / rtabmap_ros

RTAB-Map's ROS package.
http://wiki.ros.org/rtabmap_ros
BSD 3-Clause "New" or "Revised" License

Parameter "RGBD/AngularSpeedUpdate" seems to have no effect when mapping with rtabmap_ros #392

Closed gurbain closed 4 years ago

gurbain commented 4 years ago

Hi!

I am pretty new to RTAB-Map and I am having some trouble parameterizing it correctly. This might not be the best place to ask, but I noticed that most of my concerns were already solved in this issue list, so I'm giving it a try.

I'm using two Intel RealSense D435 depth cameras, pointing at the front and the back of a wheeled robot, to do mapping. I currently run it in simulation, but I have the real robot counterpart to test on once it works well. If I understand my problem correctly from the images below, the gazebo plugin or the data transport is quite slow, leading to some latency between the odometry and the image capture. This delay causes an angular bias when the robot is rotating. It does not happen when I use a simple laser scan (not available on the real robot), so I don't think it is due to the odometry. However, it also happens with the classical depth camera plugin, so I don't think the problem is specific to the realsense plugin either.

Screenshot from 2020-03-17 15-13-28
Screenshot from 2020-03-17 14-55-17

In the two images above, we can see the sensor inputs in rviz when the robot is moving linearly (top) and rotating (bottom). In the bottom picture, there is a clear angular bias between the 3D point cloud (grey) and the laser scan (red) -- or the ground truth. This disappears when the robot's angular speed is zero (top).

The mapping process suffers from this effect, and the successive clouds do not overlap correctly, as shown in the image below.

Screenshot from 2020-03-17 14-45-48

Looking through the many parameters of RTAB-Map, I noticed that these two could help me:

Param: RGBD/AngularSpeedUpdate = "0.0"                     [Maximum angular speed (rad/s) to update the map (0 means not limit).]
Param: RGBD/LinearSpeedUpdate = "0.0"                      [Maximum linear speed (m/s) to update the map (0 means not limit).]

However, in the launch file enclosed below (which summarizes what I do across the different launch files of my ROS topology), they seem to have no effect whatsoever. As I could not find more documentation than what is printed by "rosrun rtabmap_ros rtabmap --params", I'm wondering whether I understand them correctly and whether I'm using them appropriately.

Does anyone have an idea of what I'm doing wrong with the parameters of RTAB-Map? Thanks a lot!

Gabriel

slam.launch

matlabbe commented 4 years ago

Hi Gabriel,

Those parameters are used to avoid adding a new node to the map when the camera is moving and/or rotating too fast. For example, this was used in the case of RTAB-Map Tango, where we didn't want to record images while the camera was rotating fast, which was causing motion blur in the RGB images.
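For reference, they would be passed to the rtabmap node like any other RTAB-Map library parameter, i.e., as strings (the threshold values below are only illustrative):

```xml
<node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen">
  <!-- Skip map updates while the robot rotates faster than 1 rad/s -->
  <param name="RGBD/AngularSpeedUpdate" type="string" value="1.0"/>
  <!-- Skip map updates while the robot translates faster than 0.5 m/s -->
  <param name="RGBD/LinearSpeedUpdate"  type="string" value="0.5"/>
</node>
```

Note that they only gate when new nodes are added to the map; they cannot fix a synchronization offset between sensors.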

In your case, the problem is sensor synchronization. By default, if a laser scan is used, its timestamp is used to look up the current odometry pose. If the timestamps of the images are off, that is likely what causes the problem you are seeing. You may try enabling the parameter odom_sensor_sync:

<node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen" args="--delete_db_on_start">
  <param name="odom_sensor_sync" type="bool" value="true"/>
  ...
</node>

This will try to transform the camera frame to match the lidar stamp. It should reduce the problem you are seeing.

However, it may not be perfect as long as approximate synchronization is used in the rtabmap node. If you are not using a lidar on the real robot, are your D435 cameras hardware-synchronized together? At the very least, you should set approx_sync to false for the rgbd_sync nodelets. If the cameras can be hardware-synchronized, you could also set approx_sync of the rtabmap node to false. I would additionally set odom_frame_id of the rtabmap node to take odometry from TF, to avoid synchronizing yet another topic.
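As a sketch of that setup (the topic and frame names here are assumptions, adjust them to your robot):

```xml
<!-- One rgbd_sync nodelet per camera; exact synchronization (approx_sync=false)
     assumes the RGB and depth images of a camera carry identical stamps -->
<node pkg="nodelet" type="nodelet" name="rgbd_sync_front"
      args="standalone rtabmap_ros/rgbd_sync">
  <remap from="rgb/image"       to="/front_camera/color/image_raw"/>
  <remap from="depth/image"     to="/front_camera/depth/image_raw"/>
  <remap from="rgb/camera_info" to="/front_camera/color/camera_info"/>
  <param name="approx_sync" type="bool" value="false"/>
</node>

<node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen">
  <!-- Take odometry from TF instead of synchronizing an odometry topic -->
  <param name="odom_frame_id" type="string" value="odom"/>
  ...
</node>
```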

In a perfect world, all sensors would be exactly synchronized. In rtabmap, we prioritize synchronizing the laser scans with odometry over the images, because we assume that someone using a lidar would use only the map created from the lidar; the images would then serve only for loop closure detection, for which it is not critical that they be perfectly aligned with odometry.

cheers, Mathieu

gurbain commented 4 years ago

Hi @matlabbe,

Thank you for your detailed answer, and sorry for the delay. My work has been disrupted by the pandemic over the last months, and I realize now that I never reported what solved the issue for me.

Actually, the excessive desynchronization came from an inefficient gazebo plugin implementation for the D435 cameras I was using. It was causing a huge delay between the image acquisition and the time stamping (in this code snippet, if interested), hence the issue.

On the real robot, I am not using the lidar, so I have not tried odom_sensor_sync as suggested. Also, the cameras are currently not hardware-synchronized, but the results look fine for my application at the moment (although I cannot be 100% sure, because I have another problem with the odometry, which is not related to rtabmap).

I will close the issue then!