Here are some screenshots taken during mapping:
Directly after starting the node:
drive only forward:
turn right and drive a little bit forward:
turn left:
drive straight backward:
turning left (180°):
In this very small and simple environment the resulting map doesn't look too bad, but in a more complex environment the maps get "destroyed" very easily due to the lost localization ...
Between all of your movements, I’m not seeing the map update at all, so you haven’t moved enough for this package to have triggered an update. This looks like an issue with your base’s odometry.
The other thing I'm thinking about is whether the YD lidar correctly implements the laser scan message. Off the top of my head I don't know the expected rotation direction of lidars for the message. If you're seeing forward and back fine, that could still appear correct even if your entire map is actually mirrored.
Is that map correct for the space you’re in or is it mirrored? Can you have it rotate CCW?
I’ve had this running on ~6 different robots and 3 different lasers in both orientations, I’m fairly certain this is another issue with your YD lidar driver ;-)
Another case is if your laser is not actually located where the URDF claims it is. That transform is critical.
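For reference, a minimal sketch of how to print the transform that tf actually carries, so it can be compared against the physically measured mounting position. The frame names base_link and laser_link are taken from the tf tree in this thread (the intermediate ground_plate link is resolved by tf automatically):

```python
#!/usr/bin/env python
# Sketch: print the base_link -> laser_link transform tf is serving,
# to cross-check it against the physical mounting position.
import rospy
import tf2_ros

rospy.init_node('laser_tf_check')
buf = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buf)

# rospy.Time(0) requests the latest available transform.
t = buf.lookup_transform('base_link', 'laser_link',
                         rospy.Time(0), rospy.Duration(2.0))
print(t.transform.translation)
print(t.transform.rotation)
```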
Please take a bag file and make sure to include tf and tf_static.
Side note: laser_frame: laser_link doesn't do anything. That's a vestigial parameter from a since-removed need.
Between all of your movements, I’m not seeing the map update at all, so you haven’t moved enough for this package to have triggered an update.
How do you define a map update? I thought building up a wall or turning unknown space into free space counts as a "map update".
Regarding the odometry: the motor controller which creates the /odom topic and the tf transformation odom -> base_link is also home-made. Of course, errors may still be hiding in this firmware. But when using gmapping, the same odometry does not seem to have such a bad impact on the mapping result.
Regarding the lidar: I think the rotation direction is not configurable - to change it I would have to mount the lidar upside-down (which is currently not so easy). But what is the influence of the spinning direction? As said, the /scan topic seems to show the objects at the right distance and orientation when displayed in RViz.
My URDF may not be accurate to better than ±5 mm ...
Nevertheless I'll prepare a bag file with the current setup ...
5 mm is a little big, but the errors you're seeing probably aren't solely from 5 mm. I've duct-taped lidars on with that error and been fine (relatively).
The scanning direction matters a lot. That message starts at angle N and ends at M, but if N -> M is CW and it's expecting CCW, that would do weird things. What about the mirroring thing?
Looking at the driver you sent, there's a reversion parameter, though I have no idea what that does.
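One quick way to see which direction the driver claims: per REP 103, LaserScan angles are measured counter-clockwise around +z, so a negative angle_increment means the driver is reporting a clockwise sweep. A minimal sketch (topic name /scan taken from this thread):

```python
#!/usr/bin/env python
# Print the angular fields of one scan and infer the claimed sweep
# direction from the sign of angle_increment.
import rospy
from sensor_msgs.msg import LaserScan

def cb(scan):
    direction = 'CCW' if scan.angle_increment > 0 else 'CW'
    rospy.loginfo('angle_min=%.3f angle_max=%.3f increment=%.6f (%s)',
                  scan.angle_min, scan.angle_max,
                  scan.angle_increment, direction)
    rospy.signal_shutdown('done')  # one message is enough

rospy.init_node('scan_direction_check')
rospy.Subscriber('/scan', LaserScan, cb)
rospy.spin()
```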
OK - I'll have a deeper look into the driver ... and here is a bag file: mapping.zip
This is your laser, I believe; look at this in the odometric frame:
If you notice, every time you turn, the walls in the forward X direction look like they move a lot, as if there's some disconnect in the continuous sensing. The amount the wall "shifts" in the map I see generated is proportional to the angle you're rotating through; while you're rotating, your data is garbage, and it moves by seemingly 15 degrees(!!). That could be from a number of things: the laser data is just bad; scan_time: 0.117399998009 is wrong (that value is very sensitive); or the angle increments coming back don't actually line up with your URDF (if your laser is rotated, that's a huuuge problem; being off in XYZ is not so bad, but in RPY it is).
I looked at the data in the laser frame; yeah, it's not your URDF, it's the laser data / message. Do you have an IMU?
Yes - also a home-made device ;-) But it's not mounted yet ...
It seems to work fine for about half the bag, and then you turn fast and it just gets messed up. I can see that when the map updates it's really trying to throw out the garbage scans (you can see a feature appear for a cycle, then disappear, then reappear, etc.).
I'm not totally sure what to suggest on this front. It appears your sensor measurements are not invariant to the rotation of the sensor, or the drivers are improperly written. It's almost as if it's not continuously spinning and buffering measurements but sweeping back and forth around the 0 degree mark. Or it's not rotating at a fixed speed, and the offsets you see are where the laser isn't correctly aligned with its absolute encoders.
It's also a $100 lidar so I'm not sure what to expect... but I've done this with an RPlidar, which is essentially the same thing, so I'm surprised this is an issue. When I've seen issues like this in the past, it's usually been an issue with sensor fusion of the odometry or the sensor not timestamping correctly.
Yes, I agree - this cheap lidar is not the best device (but the budget is limited and unfortunately doesn't cover a SICK or Omron lidar ;-))
Regarding the scan time: although I've been working with lidars for quite a while, I'm not aware of the exact definitions of all /scan fields. From the documentation of sensor_msgs/LaserScan:
float32 time_increment  # time between measurements [seconds] - if your scanner
                        # is moving, this will be used in interpolating position
                        # of 3d points
float32 scan_time       # time between scans [seconds]
For me the fields have the following meanings (and the driver also fills them with these meanings):
time_increment: time between each reading in the ranges array
scan_time: total time duration of the scan (= number of beams * time_increment)
Is this correct? "time between scans" is ambiguous.
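A minimal sketch to sanity-check the driver against that interpretation: for a 360-degree lidar, scan_time should come out roughly equal to len(ranges) * time_increment.

```python
#!/usr/bin/env python
# Compare scan_time against n_beams * time_increment for one message.
# A large mismatch would point at wrongly filled timing fields.
import rospy
from sensor_msgs.msg import LaserScan

def cb(scan):
    expected = len(scan.ranges) * scan.time_increment
    rospy.loginfo('scan_time=%.4f  n_beams*time_increment=%.4f',
                  scan.scan_time, expected)
    rospy.signal_shutdown('done')

rospy.init_node('scan_timing_check')
rospy.Subscriber('/scan', LaserScan, cb)
rospy.spin()
```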
Just another idea: the bag was recorded on my desktop and not on the robot (due to the limited disk capacity of the Raspberry Pi). Could this possibly lead to the scan effects you noticed?
And once more: my motor controller updates the /odom topic and tf transformation at only 10 Hz - is that enough?
from tf_monitor:
RESULTS: for odom to laser_link
Chain is: odom -> base_link -> ground_plate -> laser_link
Net delay avg = 0.0798069: max = 0.15347
On your definitions: yes what you said is correct.
On the bag file: .... potentially? Worth a look but I wouldn't put too much on it.
10 Hz... given that your lidar is publishing at 10 Hz, that seems slow. Try 30 Hz. Now that could do it.
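For illustration only (the actual odometry in this thread lives in the motor-controller firmware), a minimal rospy sketch of the publishing pattern at 30 Hz; the pose/twist values are placeholders:

```python
#!/usr/bin/env python
# Publish /odom and the odom -> base_link tf at 30 Hz.
import rospy
import tf2_ros
from nav_msgs.msg import Odometry
from geometry_msgs.msg import TransformStamped

rospy.init_node('odom_publisher')
pub = rospy.Publisher('/odom', Odometry, queue_size=10)
br = tf2_ros.TransformBroadcaster()
rate = rospy.Rate(30)  # was 10 Hz; faster than the 10 Hz scan rate

while not rospy.is_shutdown():
    now = rospy.Time.now()
    odom = Odometry()
    odom.header.stamp = now
    odom.header.frame_id = 'odom'
    odom.child_frame_id = 'base_link'
    # ... fill pose/twist from the wheel encoders here ...
    pub.publish(odom)

    t = TransformStamped()
    t.header.stamp = now
    t.header.frame_id = 'odom'
    t.child_frame_id = 'base_link'
    t.transform.rotation.w = 1.0  # placeholder identity pose
    br.sendTransform(t)
    rate.sleep()
```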
I tried a quick fix in my controller firmware and set the /odom and tf update rate to roughly 35 Hz.
During the first mapping run I thought this change improved the mapping result. It ran quite well for a while, but suddenly (maybe after a strong turn) the map started to degenerate due to misalignment. Stupidly, the cable connection broke while driving and the ROS bag was lost ;-(((
After a restart I was not able to get the same good results - strange. I'll have to check it tomorrow ...
I'd also try running this in simulation with your URDF and making sure it works, as a process of elimination.
Today I was busy with the lidar driver. I tried to understand how the scan topic is displayed in RViz while the robot is moving. To eliminate all other error sources I only looked at /scan in its own frame.
From my understanding it's clear that the shapes of objects in the scan (e.g. a flat wall) must be somewhat distorted by the robot's movement, because the lidar driver doesn't know anything about its movement through the environment.
This is the scan at rest (there is a small gap in the wall here which is not a scan distortion):
Turning the robot at ~1.7 rad/s there is a kink (for the scan duration of about 120 ms this rotation speed corresponds to about 12 degrees, which is in the range of the visible kink angle):
One scan message later there is also a step:
It seems that this step/kink is the point where the scan starts and ends (because it's a 360 degree lidar). This point seems to be stable (at least in the bag file with a pure rotation that I'm currently looking at). But I can't see a significant "sweeping back and forth around the 0 degree mark" as you described.
This distortion is systematic and can only be minimized with higher scan rates, as is done in more expensive lidars.
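For reference, the kink angle implied by the numbers above (rotation rate from this test, scan_time of ~0.117 s from the bag):

$$\Delta\theta = \omega \, T_{\text{scan}} \approx 1.7\ \text{rad/s} \times 0.117\ \text{s} \approx 0.20\ \text{rad} \approx 11.4^{\circ}$$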
Do you agree or am I wrong?
I agree, but it should be possible to resolve. This is an issue with poorly developed software/firmware, not something intrinsic to a cheapo lidar. It's a timing issue that they're seemingly just not addressing.
I'm confused: to solve this issue the driver / firmware must know the robot's velocities. Correct me if I'm wrong, but I don't think this is implemented in the firmware of other high-priced lidars ...
The other way round: if there were a possibility to lower the scan rate of a well-working high-priced lidar, it would be interesting to see how its scan topic behaves ...
The lidar doesn't need to know anything about the robot velocities if the timestamps on the laser data are correct.
If the robot is moving at some high speed, a laser scan cannot be approximated as a static snapshot of the world with no aliasing. The measurements at 10 deg and 60 deg could be meters apart. However, that is OK if the times the measurements were recorded are correct, since you can back out that the timestamp for the message is N and point 74 was taken M cycles into it, work out the approximate time that measurement was taken, and transform the point into that frame at that time. That's what the laser projector inside of RViz is doing: it takes the scan, projects each point into a pointcloud at the appropriate time, and displays it. If RViz is incorrectly displaying the measurements in the laser frame, it's because the laser projector isn't able to compensate, because something in there isn't quite right. Or the data is total trash. One or the other :)
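To make that concrete, here is a minimal sketch of the per-beam time compensation described above. This is not this package's or RViz's actual code: translation and the full tf chain are ignored, and a constant angular rate omega is an assumption. Each beam i is stamped header.stamp + i * time_increment, and the sensor yaw accumulated by that instant is removed:

```python
#!/usr/bin/env python
# De-skew one scan taken while the sensor spins at omega [rad/s],
# reduced to pure rotation (rough approximation of what the laser
# projector does via tf).
import numpy as np

def deskew(ranges, angle_min, angle_increment, time_increment, omega):
    i = np.arange(len(ranges))
    beam_angle = angle_min + i * angle_increment  # angle in the moving laser frame
    t_offset = i * time_increment                 # time since header.stamp per beam
    yaw = omega * t_offset                        # sensor yaw accumulated by then
    corrected = beam_angle + yaw                  # angle in the frame at header.stamp
    x = ranges * np.cos(corrected)
    y = ranges * np.sin(corrected)
    return np.stack([x, y], axis=1)

# Example with the numbers from this thread: ~10 Hz scan (scan_time
# 0.1174 s), robot turning at 1.7 rad/s. Without the correction a
# straight wall shows the ~12 deg kink discussed above.
ranges = np.full(720, 2.0)  # fake flat readings
pts = deskew(ranges, -np.pi, 2 * np.pi / 720, 0.1174 / 720, omega=1.7)
print(pts[:3])
```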
Sorry, but my confusion grows more and more. I think I need coaching like "laser scanners for dummies" ;-))
How can I compensate the scan data in the laser frame alone? A lidar can only give me an angle, a distance and a timestamp for each point.
If the laser projector in RViz is supposed to compensate for a movement of the laser frame, then it needs knowledge of that movement, or am I on the wrong track???? (If I provide only the /scan topic to RViz and nothing else, then compensation should not be possible.)
The transform chain is map -> odom -> base -> laser; many of those transforms aren't static. So those transforms are stamped with a time. map -> odom at t=1 and t=1.1 are different, if you request them.
With that said, this package doesn't do that. But RViz does, and if RViz is displaying incorrectly, that's a big red flag that there are a lot of things wrong here that this may be affected by.
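For illustration, the "transforms are stamped" point in code; a minimal tf2 sketch, assuming the frame names from this thread. While the robot moves, the same transform requested at two different stamps gives two different answers, which is what lets the projector place each beam correctly:

```python
#!/usr/bin/env python
# Request odom -> laser_link at two nearby stamps and print both.
import rospy
import tf2_ros

rospy.init_node('stamped_lookup_demo')
buf = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buf)
rospy.sleep(2.0)  # let the buffer fill

t1 = rospy.Time.now() - rospy.Duration(1.0)
t2 = rospy.Time.now() - rospy.Duration(0.9)
a = buf.lookup_transform('odom', 'laser_link', t1, rospy.Duration(0.5))
b = buf.lookup_transform('odom', 'laser_link', t2, rospy.Duration(0.5))
print(a.transform.translation, a.transform.rotation)
print(b.transform.translation, b.transform.rotation)
```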
Yes, I agree: if I have those transformations (especially odom -> base) then it's possible for RViz to compensate the scan topic, because the movement of the laser frame is known via tf.
But in this case (without tf)
(If I provide only the /scan topic to RViz and nothing else, then compensation should not be possible)
it should not be possible ...
You are correct. If you are in the laser frame and the laser data is doing that, that's the raw aliasing. If you're in the odometry frame and it's doing it, then there's some timing/encoder issue such that the aliasing isn't being removed.
Offhand I haven't done that low a level of testing with my RPlidar to know whether it also has the same issue. I'm lucky enough to have a SICK laser at home.
OK - then I understand it too. And thanks for your patience. ;-)
Because the "raw" laser data says nothing about where in the complete system the issue comes from, I have to rethink how to solve the riddle.
I'm lucky enough to have a SICK laser at home
unfortunately out of range ... ;-)
A suggestion, though annoying:
That might be able to cancel out a fair amount of it
To some extent I don't necessarily know that'll fix your issue. I'm a little surprised you have this issue considering a SICK is only 15 Hz and yours is 10 Hz; if it were pure aliasing, I really should see something similar too. That's not some massive difference. I have a feeling it's not reporting data at the correct angle, and what you're seeing when you turn is those artifacts.
You have the -1.6 to 1.6 range; where did that come from? Is there maybe some culling of points that's wrong there?
The mounting position of the lidar is not really optimal; the ±1.6 range is the "free" field of view in front of the robot. The rest is blocked by the casing. Maybe for a test I can mount the lidar on top to get the full 360°.
I wonder if, when you request certain ranges from the lidar, it does weird things.
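If the ±1.6 rad window is being applied on the driver side, one alternative worth trying is to request the full revolution and mask in ROS instead. A hypothetical sketch (topic names and the window value are from this thread; /scan_filtered is an invented output topic):

```python
#!/usr/bin/env python
# Take the full 360-degree scan and invalidate everything outside the
# unobstructed window in software, instead of asking the lidar itself
# to cull ranges.
import rospy
from sensor_msgs.msg import LaserScan

WINDOW = 1.6  # rad, free field of view in front of the robot

def cb(scan):
    ranges = list(scan.ranges)
    for i in range(len(ranges)):
        angle = scan.angle_min + i * scan.angle_increment
        if abs(angle) > WINDOW:
            ranges[i] = float('inf')  # "no return", per REP 117
    scan.ranges = ranges
    pub.publish(scan)

rospy.init_node('scan_window_filter')
pub = rospy.Publisher('/scan_filtered', LaserScan, queue_size=10)
rospy.Subscriber('/scan', LaserScan, cb)
rospy.spin()
```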
You're more than welcome to continue this discussion, but if you're not working on it anymore and it's not directly related to this project, I'm going to close the ticket.
If you get back on it, we can reopen. I'm now also curious to get to the bottom of it. I might have to buy myself a YD lidar too...
Hi! I'm running into trouble in mapping mode: as long as the robot is moving straight forward or backward the mapping is fine, but as soon as I add a rotation it seems that the localization gets more and more lost, and therefore the mapping process generates unusable maps.
I assume that I'm using an improper configuration and need some help ...
Here is some information about the system:
diff drive robot
low cost YDLIDAR X4
tf tree:
scans seem to be valid with respect to distance and angle relative to laser_link:
using latest version of slam_toolbox (6281c8c04268d65386a2ba6f3cb1e7e8ff968555)
roslaunch slam_toolbox online_sync.launch