Rak-r opened 2 months ago
I just turned off everything and tested with only rtab's odometry (same params as above) and rtab's SLAM. No EKF and no odometry from my sensor.
I did that sharp turn again, and this time the robot behaves as I described above (it turns instead of sliding sideways as it did previously). This means my sensor and steering angle were being fused wrongly in the EKF. @matlabbe could you provide some insight on this based on the above params? I can say the odometry from my sensor is good for translation. I just need to handle the case where rtab's VO is lost; for that I fuse just the steering angle in the EKF and configured the EKF not to trust the steering much.
Is this something to do with how I am fusing odometry data from the sensor mounted on base_link, for which robot_state_publisher publishes the static transform (as it does for the camera)?
I tried to fuse only my sensor's translation (pose.x and pose.y) and the Vyaw from rtab's rgbd_odometry. I fed this setup to the EKF and ran the same test as above. At the start it seems roughly OK: the pose.x values from my sensor and from RGBD odometry differ by 0.4–0.5 m when I drive straight.
But as soon as I try the U-turn, the robot's facing is correct this time, but there is a huge difference between the EKF's estimated pose and RGBD odometry: the EKF reports values around 2.4–2.5 while RGBD odometry reports 0.24–0.26 for pose.x.
I believe I am getting confused with the setup and the EKF is making things weird.
Also, in rgbd_odometry the parameter odom_frame_id was set to odom by default; I removed it and left it blank, but still nothing changed. Need advice on this weird behaviour. @matlabbe
Update:
I played with my sensor output and found that my sensor's y values do not follow REP 103, even after negating them. For now, I pass only pose.x from my sensor, together with the absolute pose.x, pose.y and orientation.yaw from RGBD odometry, into the EKF. Performing the same U-turn now, the side drift is gone; a bit more tweaking is required for my sensor data and VO to output nearly the same measurements for pose.x.
If somebody is struggling with the same EKF fusion setup, follow the steps below.

1. First check RTAB-Map SLAM with just RTAB-Map's RGBD odometry and verify it behaves correctly indoors.
2. When fusing data from other sensors with RGBD odometry, turn off the transform publisher in RGBD odometry (publish_tf: false) so it does not publish odom->base_link. Make sure your sensor / wheel odometry node is not publishing the odom->base_link transform either.
3. Add the odom topics (or whatever your nodes output) to the EKF. Use absolute poses from the source you are more confident in, and velocities from the less stable one. Example: VO gets lost on white walls, and at that moment you need the other source to cover.
4. In the RGBD odometry launch file, set publish_null_when_lost: false and 'Odom/ResetCountdown': '1' so that odometry resets immediately and the other source is used for measurement.
5. Run rviz/rviz2 with the global frame set to odom and check that the behaviour is similar to the RGBD-odometry-only case; a small difference is fine.
6. In the RTAB-Map SLAM launch file, remap the odom topic to the EKF's output topic (usually odometry/filtered), start SLAM, and change the global frame to map. Test your own way, or as mentioned in my issue.
7. You may need to tweak the EKF's process noise covariances.
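The steps above could translate into a robot_localization config along these lines (a sketch only; topic names, frames, and which entries of the odomN_config vectors are enabled are assumptions to adapt to your robot):

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    publish_tf: true            # only the EKF publishes odom -> base_link
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # Trusted source for absolute pose (e.g. wheel / optical-flow odometry)
    odom0: /sensor_odom
    odom0_config: [true,  true,  false,    # x, y, z
                   false, false, false,    # roll, pitch, yaw
                   false, false, false,    # vx, vy, vz
                   false, false, false,    # vroll, vpitch, vyaw
                   false, false, false]    # ax, ay, az

    # Less stable source: fuse only velocities (e.g. rgbd_odometry twist)
    odom1: /vo
    odom1_config: [false, false, false,
                   false, false, false,
                   true,  false, false,
                   false, false, true,
                   false, false, false]
```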
Alternatively, there is the guess_frame_id parameter, which behaves similarly for handling the lost situation. If you want to use RGBD odometry like the EKF setup, set its odometry frame to vo and guess_frame_id to odom, and keep publishing odom -> base_link from your sensor or wheel odometry. This will not break the TF tree; the new setup would be vo -> odom -> base_link. When RTAB-Map's SLAM is launched, the TF tree would look like map -> vo -> odom -> base_link.
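A hedged sketch of launching this variant (package, executable and parameter names follow recent rtabmap_ros releases and may differ in yours):

```
ros2 run rtabmap_odom rgbd_odometry --ros-args \
  -p odom_frame_id:=vo \
  -p guess_frame_id:=odom \
  -p publish_tf:=true
```

Here rgbd_odometry publishes the vo -> odom transform on top of the odom -> base_link transform coming from your wheel/sensor odometry.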
I think the optical flow odometry and steering angle are wrongly set in their respective odometry topics. Is it really giving pose.x and pose.y globally, or relative to the base frame? You may want to set the delta value as a velocity in the base frame instead of a pose. For the steering angle, same thing: the steering angle is independent of the global yaw, so you would have to transform it into a yaw velocity instead of a yaw orientation.
The optical flow sensor is static relative to base_link, and robot_state_publisher publishes the static transform between the optical flow frame and base_link (the parent frame).
In the odometry message I generate, I set frame_id to odom and child_frame_id to base_link.
I also feed the steering angle into the yaw field of the same odometry message. This message is then fed to the EKF to handle some uncertainty. @matlabbe
I tried with just pose.x from this odometry message and the absolute pose from RGBD odometry in the EKF. Is that wrong? I also tested with velocity in x from this sensor odometry and the absolute pose from RGBD; they seem to have different values, but the side drifting is gone. Now the values increase/decrease in the same direction.
What do you mean by pose.x and pose.y globally? The message is in the odom frame, and rtab's SLAM does the correction in the map frame over the /localization_pose topic.
I also checked my sensor's y values: while turning right it sometimes reports positive values, which is wrong per REP 105. So I turned off y from my sensor and am using RGBD's absolute pose info. @matlabbe
By globally, I meant in reference to the same fixed frame (e.g., odom): independently of the orientation of the robot, the x-y values refer to the same fixed frame. For example, if the robot moves forward 1 meter (x=1, y=0), turns 90 deg counter-clockwise and moves another 1 meter forward, the resulting pose should be x=1, y=1, not x=2, y=0. If your optical flow sensor gives x-y delta values from the previous pose, transform them into a velocity and fuse the velocity instead. For the steering angle, if it reads "+30 deg", that is relative to the base frame, not the odom frame, so it cannot be used as-is in the pose of the odometry message. It could be translated to a velocity: use the twist of the odometry message instead and fuse vyaw. In the definition of the odometry message, you will see that pose is in frame_id and twist is in child_frame_id.
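This can be checked numerically with a few lines of plain Python (a standalone sketch, not ROS code): summing raw body-frame deltas without rotating them into the fixed frame gives the wrong pose.

```python
import math

def integrate(deltas, naive):
    """Integrate per-step (dx, dy, dyaw) sensor deltas into a fixed (odom) frame.

    Each (dx, dy) is measured in the robot's own frame. The naive version sums
    them directly; the correct version first rotates them by the current yaw.
    """
    x = y = yaw = 0.0
    for dx, dy, dyaw in deltas:
        if naive:
            x += dx  # wrong: treats a body-frame delta as an odom-frame delta
            y += dy
        else:
            x += dx * math.cos(yaw) - dy * math.sin(yaw)
            y += dx * math.sin(yaw) + dy * math.cos(yaw)
        yaw += dyaw
    return x, y

# forward 1 m, turn 90 deg, forward 1 m
steps = [(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0)]
print(integrate(steps, naive=True))   # -> (2.0, 0.0), wrong
print(integrate(steps, naive=False))  # -> (1.0, 1.0), correct
```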
OK. What I am doing is taking those dx and dy and feeding the absolute pose into odometry.pose.x and pose.y via abs_x += dx and abs_y += dy. I also calculate the velocities as vx = dx/dt and vy = dy/dt.
For the steering angle, I take the steering angle from the feedback messages of my linear actuator and then compute:

```python
Vyaw = Vx * tan(current_angle) / wheelbase
delta_theta = Vyaw * delta_time
Absolute_theta += delta_theta
```
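As a quick sanity check of that relation (a standalone sketch with made-up numbers; use your robot's actual wheelbase):

```python
import math

def yaw_rate(vx, steering_angle, wheelbase):
    """Bicycle-model yaw rate: vyaw = vx * tan(delta) / L."""
    return vx * math.tan(steering_angle) / wheelbase

print(yaw_rate(1.0, 0.0, 1.0))           # straight wheels -> 0.0
print(yaw_rate(1.0, math.pi / 4, 1.0))   # ~1 rad/s at 45 deg lock, 1 m/s
```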
Then generate the odometry message:

```python
odom_msg.header.stamp = self.get_clock().now().to_msg()
odom_msg.header.frame_id = 'odom'
odom_msg.child_frame_id = 'base_link'
odom_msg.pose.pose.position.x = abs_x
odom_msg.pose.pose.position.y = abs_y
odom_msg.twist.twist.linear.x = vx
odom_msg.twist.twist.angular.z = vyaw
```
The vx used to find the angular yaw is the same one the optical flow reports from the calculation above, as you just mentioned. I am getting confused with the frame settings. Do I need to set the optical flow frame as the child frame in the odom message, with odom as the parent frame, so that the absolute pose data is represented in the odom frame with respect to the optical flow frame, while the twist data is represented in the optical flow frame instead of base_link? Correct me if I am getting something wrong here or missing something.
@matlabbe
It should be the base_link frame (ideally the same frame that rgbd_odometry is using).
I would assume vx is in the base frame; you cannot just do abs_x += dx, because abs_x should be in the odom frame. dx should be transformed into the global orientation in the odom frame before summing. For robot_localization, you cannot fuse the pose of two odometry messages (see http://docs.ros.org/en/melodic/api/robot_localization/html/preparing_sensor_data.html#odometry); you should configure them as differential, which implicitly converts the poses to velocities and fuses the velocities in the base frame. You can then fuse the twist of your wheel odometry directly instead. You may have to set differential for rgbd odometry in case it is lost, or use only its twist.
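Under those constraints, the sensor-side integration could be sketched like this (plain Python, not the author's code; it assumes dx, dy are body-frame deltas from the optical flow sensor and the steering angle follows the bicycle model):

```python
import math

class DeltaIntegrator:
    """Accumulate body-frame (dx, dy) deltas into an odom-frame pose.

    The steering-derived yaw is integrated first-order; each translation
    delta is rotated by the current yaw before being summed, as required
    for the pose fields of an odometry message expressed in frame_id.
    """

    def __init__(self, wheelbase):
        self.wheelbase = wheelbase
        self.x = self.y = self.yaw = 0.0

    def step(self, dx, dy, vx, steering_angle, dt):
        # Rotate the body-frame delta into the odom frame before summing
        self.x += dx * math.cos(self.yaw) - dy * math.sin(self.yaw)
        self.y += dx * math.sin(self.yaw) + dy * math.cos(self.yaw)
        # Ackermann (bicycle-model) yaw rate from the steering angle
        vyaw = vx * math.tan(steering_angle) / self.wheelbase
        self.yaw += vyaw * dt
        # pose -> frame_id (odom); twist -> child_frame_id (base_link)
        return (self.x, self.y, self.yaw), (vx, dy / dt, vyaw)
```

Alternatively, skip the pose accumulation entirely: publish only the twist and fuse vx, vy and vyaw in robot_localization, or set the odomN_differential parameter so the filter converts the pose stream to velocities itself.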
I have been going through rtabmap's documentation and deployed it on a real robot for SLAM. I have an Intel D435 RGBD camera. My setup takes the translation pose (x, y) from an optical flow sensor and uses the steering angle for yaw, but the latter might not be accurate, so I fuse rtabmap's rgbd_odometry with just Vyaw in the EKF. I just wanted to add the steering angle to handle the case where visual odometry is lost on my Ackermann-steered robot. I got this setup by following:

- Get the optical flow sensor and steering angle to publish in a single odometry message, with pose.x, pose.y from the optical flow sensor and orientation.z from the steering angle. The sensor is mounted on base_link, so the static transform is published by robot_state_publisher.
- Use rtabmap's VO just for yaw information; I only use the velocity in yaw. publish_tf is set to false.
- In the robot_localization EKF, feed pose.x, pose.y and orientation.yaw from the sensor and steering angle, plus Vyaw from visual odometry. I verified that only the EKF publishes the odom->base_link transform.
- Launch rtabmap SLAM subscribing to the odometry/filtered topic.
- Visualize in rviz2: going forward seems roughly OK, but as soon as I turn, the robot appears to turn but sometimes touches an obstacle, i.e. it is very close to the black area marked in the 2D occupancy grid map while in reality it is a bit farther away.

Monitoring the yaw angle from the EKF, VO and steering, I can say the EKF is not trusting the steering heavily, because I set the measurement covariance for absolute yaw in my odometry message to 10000.0. rtabmap SLAM outputs localization/pose, which also seems to be somewhere between the VO's and the EKF's yaw. Is this the correct way to set up this scenario? Is rtabmap SLAM actually correcting the pose? Sometimes I see the robot jumping slightly, which I thought means it is correcting the drift.
Issue
If the robot makes a sharp turn to the right, nearly orthogonal (going forward and turning with full steering lock), in rviz I see it start moving sideways. If I make a full U-turn so that the robot faces the unmapped area behind it, in rviz I see the robot going backwards while in reality it is going forward.
Is this normal visualization behaviour, or is something wrong here?
params for rtabmap's RGBD Odometry
`
`
params for rtabmap's Slam
`
`
Main params for EKF
`
`
Checked that nothing publishes odom->base_link except the EKF.
Any suggestion to get this working properly will be highly appreciated. @matlabbe