It depends on the level at which you want to integrate the lidar: it can be used for odometry and/or SLAM. For the rtabmap node, subscribe_scan_cloud should be set to true and the scan_cloud topic should be remapped. If you subscribe to a camera at the same time, it is recommended to use the rgbd_sync nodelet to pre-synchronize the camera messages before the rtabmap node. The default Icp parameters of rtabmap assume a 2D laser scan; they would need to be adjusted for 3D point clouds. An example launch file for 3D lidar odometry and rtabmap can be found here: https://github.com/introlab/rtabmap_ros/blob/master/launch/tests/test_ouster.launch
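If you also want to compute odometry from the lidar itself, rtabmap_ros provides an icp_odometry node that can be placed in front of rtabmap. The snippet below is only a minimal sketch: it reuses the same placeholder cloud topic and base frame as the example further down, and the Icp/Odom values are rough guesses to be tuned for your lidar (test_ouster.launch above remains the reference for a complete working setup):

<node pkg="rtabmap_ros" type="icp_odometry" name="icp_odometry" output="screen">
  <!-- same placeholder cloud topic and base frame as in the example below -->
  <remap from="scan_cloud" to="/my_point_cloud_topic"/>
  <param name="frame_id" type="string" value="base_footprint"/>
  <!-- assumed ICP values: adjust for the range and density of your lidar -->
  <param name="Icp/PointToPlane"              type="string" value="true"/>
  <param name="Icp/VoxelSize"                 type="string" value="0.3"/>
  <param name="Icp/Epsilon"                   type="string" value="0.001"/>
  <param name="Icp/MaxCorrespondenceDistance" type="string" value="1"/>
  <param name="Icp/CorrespondenceRatio"       type="string" value="0.01"/>
  <param name="Odom/ScanKeyFrameThr"          type="string" value="0.8"/>
</node>

If this node is started in the same rtabmap namespace as the example below, its odom output will match the odom topic that the rtabmap node subscribes to.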
To add the camera to the rtabmap node, set approx_sync to true and subscribe_rgbd to true. You will then need to remap the rgbd_image input topic, coming from rgbd_sync. A full simplified example could look like this:
<arg name="rgb_topic" default="/my_rgb_image_topic">
<arg name="depth_topic" default="/my_registered_depth_topic">
<arg name="camera_info_topic" default="/my_camera_info_topic">
<arg name="cloud_topic" default="/my_point_cloud_topic">
<arg name="frame_id" default="base_footprint">
<group ns="rtabmap">
<node pkg="nodelet" type="nodelet" name="rgbd_sync" args="standalone rtabmap_ros/rgbd_sync">
<remap from="rgb/image" to="$(arg rgb_topic)"/>
<remap from="depth/image" to="$(arg depth_topic)"/>
<remap from="rgb/camera_info" to="$(arg camera_info_topic)"/>
<param name="approx_sync" value="false"/>
</node>
<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen" args="-d">
<param name="frame_id" type="string" value="$(arg frame_id)"/>
<param name="subscribe_depth" type="bool" value="false"/>
<param name="subscribe_rgb" type="bool" value="false"/>
<param name="subscribe_rgbd" type="bool" value="true"/>
<param name="subscribe_scan_cloud" type="bool" value="true"/>
<param name="approx_sync" type="bool" value="true"/>
<remap from="scan_cloud" to="$(arg cloud_topic)"/>
<remap from="rgbd_image" to="rgbd_image"/>
<remap from="odom" to="odom"/>
<!-- RTAB-Map's parameters -->
<param name="RGBD/NeighborLinkRefining" type="string" value="false"/> <!-- set true to refine odometry with lidar -->
<param name="RGBD/ProximityPathMaxNeighbors" type="string" value="1"/>
<param name="Reg/Strategy" type="string" value="1"/> <!-- ICP registration -->
<param name="Grid/CellSize" type="string" value="0.1"/>
<param name="Grid/RangeMax" type="string" value="20"/>
<param name="Grid/ClusterRadius" type="string" value="1"/>
<param name="Grid/GroundIsObstacle" type="string" value="true"/>
<!-- ICP parameters -->
<param name="Icp/VoxelSize" type="string" value="0.3"/>
<param name="Icp/PointToPlaneK" type="string" value="20"/>
<param name="Icp/PointToPlaneRadius" type="string" value="0"/>
<param name="Icp/PointToPlane" type="string" value="true"/>
<param name="Icp/Iterations" type="string" value="10"/>
<param name="Icp/Epsilon" type="string" value="0.001"/>
<param name="Icp/MaxTranslation" type="string" value="3"/>
<param name="Icp/MaxCorrespondenceDistance" type="string" value="1"/>
<param name="Icp/PM" type="string" value="true"/>
<param name="Icp/PMOutlierRatio" type="string" value="0.7"/>
<param name="Icp/CorrespondenceRatio" type="string" value="0.2"/>
</node>
</group>
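Assuming the example above is saved as a launch file in your own package (the package, file, and sensor topic names below are only placeholders), it can be started with the topic arguments pointing at your actual sensors, for example:

roslaunch my_package lidar_camera_rtabmap.launch \
  rgb_topic:=/camera/color/image_raw \
  depth_topic:=/camera/aligned_depth_to_color/image_raw \
  camera_info_topic:=/camera/color/camera_info \
  cloud_topic:=/os_cloud_node/points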
Now I want to use rtabmap as follows:
Thanks!
In this paper, using a lidar does indeed increase SLAM accuracy; however, it was only tested in 2D indoors. With a 3D lidar, it will depend on your lidar (range, number of rays) and the application in which you want to use it. For an indoor ground robot, I guess it will give results similar to the 2D lidar results in the paper. In 3D, you will need to try it. But yes, loop closure will be done using both image and lidar.
Cheers, Mathieu
@matlabbe Hi Mathieu,
I wanted to continue the discussion about examples for 3D lidar in this issue instead of opening a new one. Is it possible to combine the RGB data with the 3D lidar PointCloud2 data to create a colored point cloud? As far as I understand the example you gave above, as well as the test_ouster launch file (which I've tried with an actual Ouster OS1-64), including RGB only adds appearance-based loop closure detection, right? Is there any option in RTAB-Map to create a colored point cloud by combining the camera RGB and the 3D lidar point clouds, or do I need to look for 3rd-party tools to do that and then use that point cloud for SLAM? Both rtabmap_ros/point_cloud_xyzrgb and rtabmap_ros/rgbd_sync are meant for more traditional RGB + depth images from cameras, as far as I understand. While I could generate a depth image from the lidar point cloud (the Ouster actually already publishes a 64x1024 or 64x2048 pixel depth image), I am not sure what to do about its intrinsic calibration so it can be used alongside a separate RGB image.
Any input is very welcome, and if you feel the discussion doesn't belong here I can create a new issue, or (more appropriately) a question on ROS Answers / ROS Discourse.
If you are using a standard RGB camera (pinhole model), you could project the Ouster cloud into it with the rtabmap_ros/pointcloud_to_depthimage nodelet, then feed the resulting RGB-D image to rtabmap as in the example above. Using panoramic Ouster images is not yet supported in rtabmap_ros.
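For reference, a rough sketch of what that projection step could look like is below; the topic and parameter names are from memory and should be checked against the pointcloud_to_depthimage documentation, and the input topics are placeholders:

<!-- project the lidar cloud into the RGB camera to produce a registered depth image -->
<node pkg="nodelet" type="nodelet" name="pointcloud_to_depthimage"
      args="standalone rtabmap_ros/pointcloud_to_depthimage">
  <remap from="cloud"       to="/my_point_cloud_topic"/>
  <remap from="camera_info" to="/my_camera_info_topic"/>
  <!-- fixed frame used to compensate for the time difference between lidar and camera stamps -->
  <param name="fixed_frame_id"  type="string" value="odom"/>
  <param name="decimation"      type="int"    value="2"/>
  <param name="fill_holes_size" type="int"    value="3"/>
</node>

The depth image published by this nodelet, together with the camera's RGB image and camera_info, can then be given to rgbd_sync in the launch example above.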
@matlabbe In recent days I have been trying to feed a 3D laser point cloud into rtabmap for localization, but I ran into some problems writing the launch file. When I use only the camera as input, rtabmap works well, but when I add the laser point cloud it no longer works. Could you provide a standard launch file for using both the laser and the camera?
Thanks for the help, Joey