Closed billkotsias closed 3 months ago
You can feed your undistorted scan into slam_toolbox by using a scan-undistortion node between the raw sensor data and the SLAM method
@SteveMacenski I'm a newbie to SLMT... Could you please point me to some guide/code on how to insert such a node between raw data and SLAM method?
Is there a point in code where raw LIDAR rays are converted to 2D points, so that I can add my filter there?
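For context, the conversion the question refers to is the standard polar-to-Cartesian projection a 2D SLAM front end applies to a `LaserScan`. This is a generic sketch of that step, not a pointer into slam_toolbox's actual source:

```python
import math

def laserscan_to_points(ranges, angle_min, angle_increment):
    """Project LaserScan ranges into 2D Cartesian points.

    Each index i is implicitly assigned the fixed angle
    angle_min + i * angle_increment; this equidistant-angle
    assumption is baked into the LaserScan message itself.
    """
    return [
        (r * math.cos(angle_min + i * angle_increment),
         r * math.sin(angle_min + i * angle_increment))
        for i, r in enumerate(ranges)
    ]
```

The key point for this issue is that the angle is derived from the index, not stored per ray, which is exactly what breaks once motion compensation moves the points off that regular grid.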
Thank you for taking the time! I'm afraid it won't be that simple, though, because, like I said in the original post:
There are already motion compensation filters; however, they produce a PointCloud. The default scan message that slam_toolbox expects is very limited, as it can only represent rays that have the same angle between them, which is not the case when doing motion compensation.
In other words, a motion-compensation filter produces points that no longer lie on rays of equidistant angle.
So I guess I will need to adapt the SLMT lidar message interpreter.
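To make the discussion above concrete, here is a minimal, ROS-free sketch of within-scan motion compensation, assuming constant linear and angular velocity over the sweep. All names and parameters are illustrative, not slam_toolbox or filter-package API:

```python
import math

def deskew_scan(ranges, angle_min, angle_increment, scan_time, v, omega):
    """Re-project each ray into the robot frame at the *end* of the scan.

    Assumes the robot moves with constant linear velocity v (m/s,
    x-forward) and angular velocity omega (rad/s) while the LIDAR
    sweeps, and that rays are measured at evenly spaced times.
    Returns (x, y) points; after this correction they no longer lie
    on rays of equidistant angle, which is why a LaserScan message
    cannot represent the result.
    """
    n = len(ranges)
    points = []
    for i, r in enumerate(ranges):
        beam_angle = angle_min + i * angle_increment
        # Point in the sensor frame at the instant ray i was measured.
        px = r * math.cos(beam_angle)
        py = r * math.sin(beam_angle)
        # Time remaining until the end of the scan.
        dt = scan_time * (1.0 - i / (n - 1)) if n > 1 else 0.0
        # First-order pose of the scan-end frame relative to the
        # measurement frame: translated (v*dt, 0), rotated omega*dt.
        dx, dyaw = v * dt, omega * dt
        # Express the point in the scan-end frame: p' = R(-dyaw) (p - t).
        qx, qy = px - dx, py
        c, s = math.cos(dyaw), math.sin(dyaw)
        points.append((c * qx + s * qy, -s * qx + c * qy))
    return points
```

With zero velocity this reduces to the plain polar-to-Cartesian projection; with motion, each ray gets its own correction, so the output is a free-form set of points rather than a fixed angular grid.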
Feature description
When the robot is moving during mapping, LIDAR readings produce a distorted view of its surroundings. AFAICS, slam_toolbox compensates for the pose of each scan but doesn't do motion compensation within a single LIDAR scan.
Implementation considerations
There are already motion compensation filters; however, they produce a `PointCloud`. The default scan message that slam_toolbox expects is very limited, as it can only represent rays that have the same angle between them, which is not the case when doing motion compensation. IMHO it would be very robust to be able to feed slam_toolbox with a pre-compensated LIDAR `PointCloud` instead of a `LaserScan`.
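As an illustration of why a point cloud is a better fit for compensated data: a `PointCloud2`-style buffer just stores explicit per-point fields with no equidistant-angle constraint. Below is a ROS-free sketch of that layout using plain struct packing; the three-`float32` x/y/z field arrangement mirrors a common convention, but the helpers themselves are hypothetical, not any library's API:

```python
import struct

POINT_STEP = 12  # 3 float32 fields (x, y, z) * 4 bytes each

def pack_xyz(points):
    """Pack (x, y) points as consecutive x/y/z float32 triples, z = 0,
    the way a flat point-cloud data buffer is typically laid out.
    Unlike LaserScan, nothing here forces the points onto rays of
    equal angular spacing, so motion-compensated points fit losslessly.
    """
    buf = bytearray()
    for x, y in points:
        buf += struct.pack('<fff', x, y, 0.0)
    return bytes(buf)

def unpack_xyz(buf):
    """Recover the (x, y, z) triples from a packed buffer."""
    return [struct.unpack_from('<fff', buf, i)
            for i in range(0, len(buf), POINT_STEP)]
```

Each point carries its own coordinates (and, in real point-cloud messages, can carry a per-point timestamp too), which is exactly the flexibility the feature request asks slam_toolbox to accept.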