introlab / rtabmap_ros

RTAB-Map's ROS package.
http://wiki.ros.org/rtabmap_ros
BSD 3-Clause "New" or "Revised" License

How to get the rtabmap slam working and correct the pose estimate for base_link using Robot_localization Odom #1149

Rak-r commented 2 months ago

I have been going through rtabmap's documentation and deployed it on a real robot for SLAM. I have an Intel RealSense D435 RGB-D camera. My setup takes the translational pose (x, y) from an optical flow sensor and uses the steering angle for yaw, but since the steering angle might not be accurate, I am also fusing rtabmap's rgbd_odometry (Vyaw only) in the EKF. I just wanted to add the steering angle to handle the case where visual odometry is lost on my Ackermann-steered robot.

I set this up as follows:

  1. The optical flow sensor and steering angle publish to a single odometry message, with pose.x and pose.y from the optical flow sensor and orientation.z from the steering angle. The sensor is mounted on base_link, so the static transform is published by robot_state_publisher.

  2. rtabmap's visual odometry is used only for yaw; I take just the yaw velocity from it. publish_tf is set to false.

  3. In the robot_localization EKF, I feed pose.x, pose.y and orientation.yaw from the sensor and steering angle, plus Vyaw from visual odometry. I verified that only the EKF is publishing the odom->base_link transform.

  4. I launch rtabmap SLAM subscribed to the odometry/filtered topic.

  5. I visualize in rviz2. Going forward it looks OK-ish, but as soon as I turn, the robot appears to turn yet sometimes touches the obstacle, i.e., it gets very close to the occupied (black) cells of the 2D occupancy grid map, while in reality it is a bit farther away.

  6. Monitoring the yaw angle from the EKF, the VO and the steering, I can say the EKF is not trusting the steering heavily, because I set the measurement covariance for absolute yaw in my odometry message to 10000.0. rtabmap SLAM outputs localization_pose, whose yaw also seems to lie somewhere between the VO's and the EKF's.

Is this the correct way to set up this scenario? Is rtabmap SLAM actually correcting the pose? Sometimes I see the robot jump slightly, which I took to mean it is correcting the drift.

Issue

If the robot makes a sharp right turn, nearly orthogonal (going forward while turning at full steering lock), I see it start moving sideways in rviz. If I make a full U-turn so that the robot faces the unmapped area behind it, in rviz the robot appears to move backwards while in reality it is going forward.

Is this a normal visualization artifact, or is something wrong here?

params for rtabmap's RGBD Odometry

```python
parameters = {
    'frame_id': 'base_link',
    'use_sim_time': use_sim_time,
    'subscribe_depth': True,
    'approx_sync': True,
    'publish_tf': False,
    'Reg/Force3DoF': 'true',
    'publish_null_when_lost': False,
    'Grid/RangeMin': '0.3',
    'Grid/RangeMax': '5.0',
    'Optimizer/GravitySigma': '0'  # disable IMU constraints (we are already in 2D)
}
```
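
For context, a minimal sketch of how such a dict is typically passed to the RGBD odometry node in a ROS 2 Python launch file; the package/executable names follow current rtabmap_ros (ROS 2) conventions, and the D435 topic names are assumptions to adjust for your camera driver:

```python
from launch_ros.actions import Node

rgbd_odometry_node = Node(
    package='rtabmap_odom', executable='rgbd_odometry', output='screen',
    parameters=[parameters],  # the dict above
    remappings=[
        ('rgb/image', '/camera/color/image_raw'),
        ('rgb/camera_info', '/camera/color/camera_info'),
        ('depth/image', '/camera/aligned_depth_to_color/image_raw')])
```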

params for rtabmap's SLAM

```python
slam_param = {
    'use_sim_time': use_sim_time,
    'subscribe_depth': True,
    'subscribe_scan': True,
    # 'subscribe_scan_cloud': True,
    'approx_sync': True,
    # 'approx_sync_max_interval': 0.01,
    'publish_tf': True,
    'Reg/Force3DoF': 'true',         # publish only a 2D pose
    'map_always_update': False,      # update the map only when the robot moves
    'Grid/RangeMin': '0.3',          # min range of depth
    'Grid/RangeMax': '5.0',          # max range of depth (the Intel D435 loses accuracy toward 10 m)
    'Grid/RayTracing': 'true',       # ray-trace to clear cells and mark free space in the occupancy grid
    'Grid/FromDepth': 'true',        # generate the occupancy map from the depth image
    'RGBD/OptimizeMaxError': '0.1',  # reject wrong loop closures
    'RGBD/ProximityBySpace': 'false',
    'RGBD/AngularUpdate': '0.01',
    'RGBD/LinearUpdate': '0.01',
    # 'use_action_for_goal': True,
    # 'qos_image': qos,
    # 'qos_imu': qos,
    'Optimizer/GravitySigma': '0'    # disable IMU constraints (we are already in 2D)
}
```
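
Likewise, a hedged sketch of the SLAM node launch, including the odom remapping from step 4 so it subscribes to the EKF output; package/executable names again assume the ROS 2 layout of rtabmap_ros:

```python
from launch_ros.actions import Node

rtabmap_node = Node(
    package='rtabmap_slam', executable='rtabmap', output='screen',
    parameters=[slam_param],  # the dict above
    remappings=[('odom', '/odometry/filtered')],  # consume the fused odometry
    arguments=['-d'])  # optional: clear the database on start
```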

Main params for EKF

```yaml
odom0: /optical_flow_odom
odom0_config: [true,  true,  false,
               false, false, true,
               false, false, false,
               false, false, false,
               false, false, false]
odom0_differential: false

odom1: /rtabmap_odom
odom1_config: [false, false, false,
               false, false, false,
               false, false, false,
               false, false, true,
               false, false, false]

process_noise_covariance: [0.005, 0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0005, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.06, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.03, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.03, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.06, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.04, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.04, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.04, 0.0,  0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.01, 0.0,  0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.01, 0.0,  0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.02, 0.0,  0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.01, 0.0,  0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.01, 0.0,
                           0.0,   0.0,    0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.015]
```
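
Note that robot_localization publishes the world_frame -> base_link_frame transform, so the frame parameters must be set for the EKF to own odom -> base_link. A minimal launch sketch (standard robot_localization parameter names; `ekf_yaml_path` is a hypothetical variable pointing at the config above):

```python
from launch_ros.actions import Node

ekf_node = Node(
    package='robot_localization', executable='ekf_node', name='ekf_filter_node',
    output='screen',
    parameters=[{
        'odom_frame': 'odom',
        'base_link_frame': 'base_link',
        'world_frame': 'odom',  # EKF publishes world_frame -> base_link_frame,
        'publish_tf': True,     # i.e. the odom -> base_link checked below
    }, ekf_yaml_path])          # hypothetical path to the yaml config above
```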

Checked

  1. No other node is publishing the odom->base_link transform except the EKF.
  2. Verified that the optical flow sensor measurements are correct for linear motion.

Any suggestions to get this working properly would be highly appreciated. @matlabbe

Rak-r commented 2 months ago

Update:

I turned everything off and tested with just rtabmap's odometry (same params as above) and rtabmap's SLAM: no EKF and no odometry from my sensor.

I did that sharp turn again and this time the robot behaves as I would expect (it is turning instead of sliding sideways as it did previously). This means my sensor and steering angle were being fused wrongly in the EKF. @matlabbe, could you provide some insight based on the params above? I can say that the odometry from my sensor is good for translation; I just need to handle the case where rtabmap's VO is lost, and for that I fuse only the steering angle in the EKF, set up so the EKF does not trust the steering much.

Could this be related to how I am fusing odometry data from the sensor mounted on base_link, for which robot_state_publisher publishes the static transform (and likewise for the camera)?

Rak-r commented 2 months ago

I tried fusing only my sensor's translation (pose.x and pose.y) together with Vyaw from rtabmap's rgbd_odometry, fed this setup to the EKF, and ran the same test as above. At the start it seems OK-ish; pose.x from my sensor and from rgbd_odometry differ by about 0.4-0.5 m when I drive straight.

But as soon as I make the U-turn, the robot's heading is correct this time, yet there is a huge difference between the EKF's estimated pose and rgbd_odometry's: the EKF reports pose.x values around 2.4-2.5 while rgbd_odometry reports around 0.24-0.26.

I believe I am getting confused with the setup and the EKF is making things weird.

Also, in rgbd_odometry the parameter odom_frame_id was set to odom by default; I removed it and left it blank, but nothing changed. I need advice on this weird behaviour. @matlabbe

Rak-r commented 1 month ago

Update:

I played with my sensor output and found that its y values do not follow REP 103, even after negating them. For now I just pass pose.x from my sensor, together with the absolute pose.x, pose.y and orientation.yaw from RGBD odometry, into the EKF. Performing the same U-turn, the side drift is gone; a bit more tweaking is required so that my sensor and the VO report nearly the same pose.x.

If somebody is struggling with the same EKF fusion setup, follow the steps below (a consolidated sketch follows the list).

  1. First check rtabmap SLAM with just RTAB-Map's RGBD odometry and verify it behaves correctly indoors.

  2. If fusing data from other sensors with RGBD odometry, turn off the transform publishing in RGBD odometry: publish_tf: false for odom->base_link.

  3. Make sure your sensor node / wheel odometry node is not publishing the odom->base_link tf either.

  4. Add the odometry topics (or whatever your nodes output) to the EKF; use absolute poses from the source you trust more and velocities from the less stable one. Example: VO gets lost on white walls, and at that moment the other source needs to take over.

  5. Then, in the RGBD odometry launch file, set publish_null_when_lost: false and 'Odom/ResetCountdown': '1' so odometry resets immediately and the other source covers the gap.

  6. Run rviz/rviz2 with the global frame set to odom and verify it behaves much like the RGBD-odometry-only case; a small difference is fine.

  7. Now, in the rtabmap SLAM launch file, remap the odom topic to the EKF's output topic (usually odometry/filtered) and start SLAM with the global frame set to map; test your own scenarios or the one mentioned in this issue.

  8. You might need to tweak the EKF's process noise covariances.
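
A consolidated sketch of steps 2, 5 and 7 as parameter/remapping fragments for a ROS 2 Python launch file; publish_null_when_lost and Odom/ResetCountdown are existing rtabmap_ros parameters, and the rest mirrors the launch sketches earlier in this issue:

```python
# Steps 2 and 5: RGBD odometry feeds the EKF, so it must not publish tf,
# and it should reset immediately instead of staying lost.
vo_params = {
    'frame_id': 'base_link',
    'publish_tf': False,              # the EKF owns odom -> base_link (step 3)
    'publish_null_when_lost': False,  # no null odometry messages when lost
    'Odom/ResetCountdown': '1',       # reset right away; the other source takes over
}

# Step 7: SLAM consumes the fused odometry instead of raw VO.
slam_remappings = [('odom', '/odometry/filtered')]
```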

The other way is to use just RGBD odometry with the guess_frame_id parameter, which handles the lost situation in a similar way (see the sketch after this list).

  1. Set the odometry frame id to vo if you want RGBD odometry to act like the EKF above, set guess_frame_id to odom, and keep publishing odom -> base_link from your sensor or wheel odometry.

  2. This will not break the tf tree; the new setup would be vo -> odom -> base_link.

  3. When rtabmap's SLAM is launched, the tf tree would look like map -> vo -> odom -> base_link.
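
A hedged parameter sketch for this alternative; odom_frame_id and guess_frame_id are existing rgbd_odometry parameters, and the frame names follow the steps above:

```python
guess_mode_params = {
    'frame_id': 'base_link',
    'odom_frame_id': 'vo',     # VO publishes vo -> odom
    'guess_frame_id': 'odom',  # your sensor keeps publishing odom -> base_link as the guess
    'publish_tf': True,        # resulting tree: vo -> odom -> base_link
}
```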

matlabbe commented 1 month ago

I think the optical flow odometry and steering angle are wrongly set in their respective odometry topics. Is it really giving pose.x and pose.y globally, or relative to the base frame? You may want to set the delta value as a velocity in the base frame instead of a pose. For the steering angle, same thing: the steering angle is independent of the global yaw, so you would have to transform it into a yaw velocity instead of a yaw orientation.

Rak-r commented 1 month ago

The optical flow sensor is static relative to base_link, and robot_state_publisher publishes the static transform between the optical flow frame and base_link (the parent frame).

In the odometry message I generate, I set the frame id to odom and the child frame id to base_link.

I also feed the steering angle into the yaw field of the same odometry message. This message is then fed to the EKF to handle some uncertainty. @matlabbe

I tried feeding just pose.x from this odometry message plus the absolute pose from RGBD odometry into the EKF. Is that wrong? I also tested with the x velocity from my sensor odometry and the absolute pose from RGBD; they have different values, but the side drifting is gone and the values now increase/decrease in the same direction.

Rak-r commented 1 month ago

What do you mean by pose.x and pose.y globally? The message is in the odom frame, and rtabmap's SLAM does the correction in the map frame over the /localization_pose topic.

I also checked my sensor's y values: while turning right it sometimes reports positive values, which is wrong per REP 105. So I turned off y from my sensor and use RGBD's absolute pose info. @matlabbe

matlabbe commented 1 month ago

By globally, I meant in reference to the same fixed frame (e.g., odom). Independently of the orientation of the robot, the x-y values refer to the same fixed frame. For example, if the robot moves forward 1 meter (x=1, y=0), turns 90 deg clockwise and moves another 1 meter forward, the resulting pose should be x=1, y=-1, not x=2, y=0. If your optical flow sensor gives x-y delta values from the previous pose, transform them into a velocity and fuse the velocity instead. For the steering angle, if it says "+30 deg", that is relative to the base frame, not the odom frame, so it cannot be used as-is in the pose of the odometry message. It could be translated into a velocity, put in the twist of the odometry message, and fused as vyaw.

In the definition of the odometry message, you will see that the pose is in frame_id and the twist is in child_frame_id.
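
A minimal sketch of what that advice implies, assuming body-frame deltas from the optical flow sensor and a bicycle-model conversion for the steering angle; the input values and names (dx, dy, dt, steering_angle, wheelbase) are illustrative:

```python
import math
from nav_msgs.msg import Odometry

# Illustrative inputs: body-frame deltas from the optical flow sensor over dt,
# plus the current steering angle and the robot's wheelbase.
dx, dy, dt = 0.02, 0.0, 0.1           # meters, meters, seconds
steering_angle, wheelbase = 0.3, 0.5  # radians, meters

# Deltas become velocities, fused as twist (expressed in child_frame_id),
# instead of being integrated directly into pose.x / pose.y.
vx = dx / dt
vy = dy / dt
vyaw = vx * math.tan(steering_angle) / wheelbase  # bicycle model

odom_msg = Odometry()
odom_msg.header.frame_id = 'odom'      # pose (left unfilled) would live here
odom_msg.child_frame_id = 'base_link'  # twist is expressed here
odom_msg.twist.twist.linear.x = vx
odom_msg.twist.twist.linear.y = vy
odom_msg.twist.twist.angular.z = vyaw
# In the EKF, fuse only the twist fields of this message (vx, vy, vyaw).
```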

Rak-r commented 1 month ago

OK. What I am doing is taking those dx and dy and feeding the absolute pose into odometry.pose.x and pose.y via abs_x += dx and abs_y += dy, and I also calculate the velocities with vx = dx/dt and vy = dy/dt.

For the steering angle, I take the steering angle from the feedback messages of the linear actuator and then use:

```python
Vyaw = Vx * tan(current_angle) / wheelbase
delta_theta = Vyaw * delta_time
Absolute_theta += delta_theta
```

Then I generate the odometry message:

```python
odom_msg.header.stamp = self.get_clock().now().to_msg()
odom_msg.header.frame_id = 'odom'
odom_msg.child_frame_id = 'base_link'
odom_msg.pose.pose.position.x = abs_x
odom_msg.pose.pose.position.y = abs_y
odom_msg.twist.twist.linear.x = vx
odom_msg.twist.twist.angular.z = vyaw
```

The vx used to find the yaw rate is the same one the optical flow reports, from the calculation above and what you just mentioned. I am getting confused with the frame settings. Do I need to set the optical flow frame as the child frame in the odom message, with odom as the parent frame, so that the absolute pose data is represented in the odom frame with respect to the optical flow frame, but then the twist data would be represented in the optical flow frame instead of base_link? Correct me if I am getting something wrong here or missing something. @matlabbe

matlabbe commented 1 month ago

It should be the base_link frame (ideally the same frame rgbd_odometry is using).

I would assume vx is in the base frame. You cannot just do abs_x += dx, because abs_x should be in the odom frame; dx would have to be transformed into the global orientation in the odom frame before summing. For robot_localization, you cannot fuse the poses of two odometry messages (see http://docs.ros.org/en/melodic/api/robot_localization/html/preparing_sensor_data.html#odometry); you should set them as differential, which implicitly converts the poses to velocities and fuses those velocities in the base frame. You can then fuse the twist of your wheel odometry directly instead. You may have to set differential for rgbd odometry in case it gets lost, or use only its twist.
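
A hedged sketch of both fixes: rotating the body-frame deltas into the odom frame before summing, and, alternatively, letting robot_localization difference the poses itself (odom0_differential / odom1_differential are standard robot_localization parameters; the state values here are illustrative):

```python
import math

# Illustrative state: current integrated pose in the odom frame.
abs_x, abs_y, abs_theta = 0.0, 0.0, math.radians(90)
dx, dy = 0.02, 0.0  # body-frame deltas from the optical flow sensor

# Rotate the body-frame deltas by the current heading before integrating,
# so abs_x / abs_y accumulate in the odom frame.
abs_x += dx * math.cos(abs_theta) - dy * math.sin(abs_theta)
abs_y += dx * math.sin(abs_theta) + dy * math.cos(abs_theta)

# Alternative: fuse the poses differentially so robot_localization converts
# them to velocities internally (fragment extending the EKF config above):
ekf_extra_params = {
    'odom0_differential': True,  # /optical_flow_odom poses differenced into velocities
    'odom1_differential': True,  # same for rgbd odometry, in case it resets when lost
}
```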