ros-perception / laser_filters

Assorted filters designed to operate on 2D planar laser scanners, which use the sensor_msgs/LaserScan type.
BSD 3-Clause "New" or "Revised" License

Added LaserScanBinningFilter: places measurements into fixed number of bins. #192

Open agoeckner opened 2 months ago

agoeckner commented 2 months ago

Some software such as slam_toolbox requires that every LaserScan message have the exact same number of measurements. However, cheap LIDAR units (such as the TurtleBot3's LD08/LDS-02) are incapable of this precision and/or have poorly written drivers.

When using these sensors in combination with slam_toolbox, users will often see messages like the following:

LaserRangeScan contains 228 range readings, expected 230

For example, see this issue: https://github.com/ldrobotSensorTeam/ldlidar_stl_ros2/issues/11

This PR resolves that issue by allowing users to bin their laserscans such that every message has the same number of measurements.

No interpolation of the scan data is performed when multiple measurements fall into the same bin; one value will simply overwrite the others. It is recommended to set the num_bins parameter high enough that this does not happen. For example, for the TurtleBot3's LD08/LDS-02 sensor, set num_bins to at least 360.
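For readers skimming the thread, here is a minimal sketch (not the PR's actual implementation) of the binning idea described above, in ROS 2 C++. The output metadata, especially angle_min/angle_max, and the handling of intensities are simplified assumptions and are exactly the details discussed further down.

```cpp
#include <cmath>
#include <limits>
#include <sensor_msgs/msg/laser_scan.hpp>

// Minimal sketch of the binning idea: num_bins (> 0) evenly spaced bins over a
// full revolution; a later measurement overwrites an earlier one that lands in
// the same bin. Intensities are dropped for brevity.
sensor_msgs::msg::LaserScan binScan(const sensor_msgs::msg::LaserScan & in, int num_bins)
{
  sensor_msgs::msg::LaserScan out = in;
  out.angle_min = 0.0f;
  out.angle_max = 2.0f * static_cast<float>(M_PI);  // whether this should be 2*pi or 2*pi - increment is debated below
  out.angle_increment = 2.0f * static_cast<float>(M_PI) / num_bins;
  out.ranges.assign(num_bins, std::numeric_limits<float>::quiet_NaN());
  out.intensities.clear();

  for (size_t i = 0; i < in.ranges.size(); ++i) {
    double angle = in.angle_min + static_cast<double>(i) * in.angle_increment;
    angle = std::fmod(angle, 2.0 * M_PI);            // wrap into [0, 2*pi)
    if (angle < 0.0) {
      angle += 2.0 * M_PI;
    }
    const int bin = static_cast<int>(angle / (2.0 * M_PI) * num_bins);
    if (bin >= 0 && bin < num_bins) {
      out.ranges[bin] = in.ranges[i];                // last value wins; no interpolation
    }
  }
  return out;
}
```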

Tagging @SteveMacenski.

agoeckner commented 2 months ago

Good point, thank you Steve. I've updated that now. I'll test my changes once I get into the lab later today and will let you know if all still works as expected.

agoeckner commented 2 months ago

Hey @SteveMacenski, I've updated the code to respect the original scan arc. However, this causes some problems with slam_toolbox:

LaserRangeScan contains 359 range readings, expected 360

I believe that this is caused by the residual variable in https://github.com/SteveMacenski/slam_toolbox/blob/ros2/lib/karto_sdk/include/karto_sdk/Karto.h#L4299.

Are you sure that fixing the arc to 0-360 degrees is a problem? Any values outside the original arc would be set to NaN, so I don't think it would cause any issues.

SteveMacenski commented 2 months ago

If it was working before your adjustment away from 360 in this PR, then I think the issue lives in this PR's update, no?

BTW: I'm not a maintainer here, I can't merge or officially review, I just saw you tag me and I think this is valuable. Someone else will need to come in and review / merge

agoeckner commented 2 months ago

No, slam_toolbox does not consider the lidar sensor to be a 360-degree lidar, because the LD08/LDS-02 and its drivers are of such poor quality that the max-minus-min angle is never actually close enough to 360 degrees. Because of that, slam_toolbox expects another measurement (which will never exist), as seen in the code I linked above.

Everything worked as expected when I forced the min to 0 and max to 360, since slam_toolbox then considered the sensor to be a 360-degree lidar. Anything slightly off and slam_toolbox will expect that nonexistent extra measurement. There's some discussion of that here: https://github.com/SteveMacenski/slam_toolbox/issues/293#issuecomment-696296457

And all good, I just tagged you for advice since I figure that this is in your area of expertise!

jonbinney commented 2 months ago

I'm a bit worried that this filter is specific to slam_toolbox - are there any other packages that we can test this against?

SteveMacenski commented 2 months ago

That's a fair point, this is actually an internal Karto bit so Karto SLAM will also have this problem (though SLAM Toolbox has virtually superseded that, so not really a second independent datapoint).

I think regardless, it is valuable for the cheap-o lidars to have a way to feel more like the professional lidars that most of the perception and robotics systems assume. I can't point to a second application today that needs this, but this also fills a niche gap between simulation 2D 360 lidars and "real" ones that have that inconsistency which could have value.

That's how I think about this, but I defer entirely to your judgement.

jonbinney commented 2 months ago

True - it's definitely useful. I don't entirely understand the back and forth here about 360 vs 359 bins. From the name "binning" I would expect this to create evenly spaced bins, and 360 of them sounds like a reasonable default. Bin 0 would correspond to 0 to 1 degrees, and so on up to bin 359 corresponding to 359 to 360 degrees. This seems like a nicely simple and general approach, although I don't claim to understand the pros and cons w.r.t. SLAM performance.

The other option (which I think is closer to what it is doing now) is to keep angle_increment the same, and pad it with NaNs. But there's no guarantee that angle_increment divides cleanly into 2*pi, so wouldn't there be an irregular angle_increment between the end of one scan and the beginning of the next? Also if this is the intended behavior, "binning_filter" seems like the wrong name - it feels more like a "pad-with-nans" filter.
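As a quick numeric illustration of that wrap-around concern (the angle_increment value below is made up, roughly 1 degree, just to show the effect):

```cpp
#include <cmath>
#include <cstdio>

// If the output keeps the input angle_increment and pads to a fixed size,
// the angular gap between the last beam of one revolution and the first
// beam of the next is generally not equal to angle_increment.
int main()
{
  const double two_pi = 2.0 * M_PI;
  const double angle_increment = 0.0175;  // ~1.0 degree, illustrative only
  // Beams at 0, inc, 2*inc, ... as long as they stay below a full revolution.
  const int n_beams = static_cast<int>(std::floor(two_pi / angle_increment)) + 1;
  const double last_beam = (n_beams - 1) * angle_increment;
  const double wrap_gap = two_pi - last_beam;
  std::printf("increment: %.5f rad, wrap-around gap: %.5f rad\n", angle_increment, wrap_gap);
  return 0;
}
```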

@agoeckner could you describe in words what the intended function of the filter is? (not in relation to karto; just a general description like I've given in the first two paragraphs). Once we can agree on the desired functionality, I'll review the code carefully.

jonbinney commented 2 months ago

(you did that in the description of the PR but the functionality seems to have changed a bit)

agoeckner commented 2 months ago

Hi @jonbinney, this code was originally written to enable the use of low-quality LIDAR sensors with the slam_toolbox package.

Originally, it placed all measurements into a fixed number of bins which were evenly spaced over a 360-degree arc.

I made revisions (which you see now) to address some of Steve's comments, but after making (and testing) these I find that the filter no longer addresses the original goal.

I will revert those changes shortly, so that the binning filter simply places all measurements into bins evenly-spaced around a 360-degree arc.

agoeckner commented 2 months ago

@jonbinney @SteveMacenski updated as described above. Let me know how it looks!

jonbinney commented 2 months ago

I started a review and then realized I don't know enough about our assumptions. For the lasers and drivers this is designed to help with:

- are angle_min, angle_max, and increment the same for every scan published by the driver?
- does (angle_max - angle_min) / angle_increment always equal ranges.size() + 1?
- are scan_time and/or time_increment set to useful values? (the current approach makes these a bit meaningless in the output scan)

Just to make sure my memory is correct: for LaserScan messages, I would assume that ranges.size() ought to equal (angle_max - angle_min) / angle_increment + 1. That would mean that for a 360 degree scanner with 1 degree between each laser beam, each scan message the driver publishes should have angle_min==0 and angle_max==359 (not 360). @SteveMacenski @agoeckner is this true?
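A tiny arithmetic check of that convention (angles written in degrees purely for readability):

```cpp
#include <cassert>

int main()
{
  // 360-degree scanner, one beam per degree, per the convention above:
  // ranges.size() == (angle_max - angle_min) / angle_increment + 1.
  const double angle_min = 0.0;
  const double angle_max = 359.0;  // not 360
  const double angle_increment = 1.0;
  const int n_ranges = static_cast<int>((angle_max - angle_min) / angle_increment) + 1;
  assert(n_ranges == 360);
  return 0;
}
```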

agoeckner commented 2 months ago

Good questions.

are angle_min, angle_max, and increment the same for every scan published by the driver?

No, these are all over the place. The scan arc is never the same between two messages (which is why slam_toolbox doesn't like this sensor).

does (angle_max - angle_min) / angle_increment always equal ranges.size() + 1?

It looks more like ceil((angle_max - angle_min) / angle_increment) == ranges.size() in the original scan.

are scan_time and/or time_increment set to useful values? (the current approach makes these a bit meaningless in the output scan)

Hmm, good point. They do appear to be valid in the original scan, with ceil(scan_time / time_increment) == ranges.size(). Do you think that I should adjust time_increment to achieve the same in the filtered scan?

Just to make sure my memory is correct: for LaserScan messages, i would assume that ranges.size() ought to equal (angle_max - angle_min) / angle_increment + 1. That would mean that for a 360 degree scanner with 1 degree between each laser beam, each scan message the driver publishes should have angle_min==0 and angle_max==359 (not 360).

You're probably right if it's a decent sensor! My LD08/LDS-02 though never provides scan messages with consistent angles. 🙃
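A small diagnostic along the lines of what agoeckner describes observing could look like this (purely illustrative; it just prints the three counts so they can be compared per message):

```cpp
#include <cmath>
#include <cstdio>
#include <sensor_msgs/msg/laser_scan.hpp>

// Print the number of range readings alongside what the angular metadata
// and the timing metadata imply, so per-message inconsistencies are visible.
void reportScanConsistency(const sensor_msgs::msg::LaserScan & scan)
{
  const auto n = scan.ranges.size();
  const double from_angles =
    std::ceil((scan.angle_max - scan.angle_min) / scan.angle_increment);
  const double from_timing =
    scan.time_increment > 0.0f ? std::ceil(scan.scan_time / scan.time_increment) : 0.0;
  std::printf("ranges: %zu, implied by angles: %.0f, implied by timing: %.0f\n",
              n, from_angles, from_timing);
}
```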

SteveMacenski commented 2 months ago

are angle_min, angle_max, and increment the same for every scan published by the driver?

Generally speaking: yes. But for these really cheap lidars, the motor might mess up the speed for a bit, something bumps it hard, etc., which impacts the number of measurements. I don't think the increment in the message data is actually updated in any of their drivers to account for this, but the data can be slightly different, +/- 1-2 data points, which are usually just omitted.

are scan_time and/or time_increment set to useful values? (the current approach makes these a bit meaningless in the output scan)

I think it's the same case. For professional lidars, typically yes (but not always). For the hobby/consumer grade ones I think it's case by case. The scan time shouldn't be affected by this operation, though. The time_increment... perhaps should be adjusted? I suppose if the lidar takes one too few or one too many measurements, that means either (A) the rotation velocity is off, so the time increment is off, or (B) the onboard microprocessor was underpowered and missed recording measurements at a particular encoder signal but the rotation speed was correct, so the time increment is correct (the data is just incomplete).

A is what I assumed was happening in this case, but I suppose I haven't gone deep enough into it to say with absolute certainty that B isn't happening for the RPLidar A1 or similar.

jonbinney commented 2 months ago

My worry about time_increment is that there could be some tricky corner cases:

I think I understand the scope of the problem with the input scans now. Now a couple questions about what karto/slam_toolbox requires:

SteveMacenski commented 2 months ago

I'll leave the first questions to @agoeckner

does angle_min need to be the same in every scan? does it use time_increment to adjust the position of points in the scan? Or does it assume the entire scan takes place at exactly msg.timestamp?

Karto assumes a professional grade lidar that produces consistent measurements at regular intervals. It doesn't consider message-to-message discrepancies. It takes in the metadata it requires when we first register the lidar sensor (resolution, range, name, etc.), and from that point on the data is stored as a vector of range values, which are interpreted by various subsystems as needed to match scans or generate an occupancy grid.

So, does the angle min/max need to be the same... not technically, but it sure does cause a problem if there's any variation in it, because Karto assumes a consistent measurement source. These cheapo lidars didn't exist when that was developed, and I don't think any SLAM whose source code I've read really treats each individual measurement as having potentially different internal metadata when coming from a single source. The reason SLAM Toolbox / Karto is the source of problems for those lidar users is that we check the input data for consistency before processing and log errors when it's wrong (gmapping / cartographer do not). Their results also suffer from having some measurements that are misaligned; the end user just isn't aware of it. My project just rejects those updates explicitly rather than silently processing data that isn't what the processor expects.

So circling back to your question: yes, really all the data should always be the same for a well-made laser scan sensor, and that is assumed to be the case by all the openly available SLAM and localization pipelines in the ROS community that I'm aware of. I just surveyed Karto, Cartographer, gmapping, and AMCL. I don't have reason to think any others would be different.

At least Karto/SLAM Toolbox doesn't care much about time_increment - we assume the whole scan happens at the timestamp and don't attempt any de-aliasing. But I'd expect any filter in this package to properly update that field for consumers that do require it. I know there are TF tools that will use it to (relatively speaking) more rigorously transform laser scans into PointCloud2, using time_increment to de-alias, and we actually expose that in 1-2 places in Nav2 where that subtlety can be important.

agoeckner commented 2 months ago

Hey Jon, I think that Steve summed it up pretty well. Sounds like the only remaining task is to fix the time_increment, taking into account the corner cases as you mentioned.
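One plausible way to handle it, sketched here as an assumption rather than a statement of what the PR will do: if scan_time covers one full revolution and the binned output has num_bins evenly spaced beams, spread the revolution time evenly across the bins.

```cpp
#include <sensor_msgs/msg/laser_scan.hpp>

// Assumed adjustment, not necessarily the PR's final behavior: keep the
// revolution time, and make time_increment the per-bin share of it.
void adjustTiming(const sensor_msgs::msg::LaserScan & in,
                  sensor_msgs::msg::LaserScan & out, int num_bins)
{
  out.scan_time = in.scan_time;
  out.time_increment = num_bins > 0 ? in.scan_time / static_cast<float>(num_bins) : 0.0f;
}
```

This is only one option; whether it handles the corner cases discussed above is exactly what the review needs to settle.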

jonbinney commented 2 months ago

Sounds good - I'm not sure there's a perfect solution to all corner cases and I suspect there are more of them than the ones I listed. Want to update the PR with what you think is best and then we can discuss?

agoeckner commented 1 month ago

Sounds good Jon. I'll update soon.

Any chance that you or @SteveMacenski will be at Automate today in Chicago?

jonbinney commented 1 month ago

Great! Unfortunately I'm not at Automate - would be fun to meet up!