NVIDIA-ISAAC-ROS / isaac_ros_visual_slam

Visual SLAM/odometry package based on NVIDIA-accelerated cuVSLAM
https://developer.nvidia.com/isaac-ros-gems
Apache License 2.0

Drift with Intel Realsense D455 #67

Open oscarpang opened 1 year ago

oscarpang commented 1 year ago

Hi there,

I'm trying the Visual SLAM ROS package with an Intel RealSense D455 in an outdoor environment, and I'm seeing a rather large drift in the SLAM output. I recorded the infrared camera images and camera info in ROS 2 bags and played them back into Isaac Visual SLAM. Comparing the SLAM result against COLMAP shows about 1.2 m of error over 10 seconds. I also checked the scale of the resulting landmark point cloud; it is about 0.7x–0.8x smaller than it should be.
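(Editor's note: for anyone quantifying this kind of drift and scale error, the standard approach is to align the VSLAM trajectory to the reference trajectory, e.g. from COLMAP, with a least-squares similarity transform and read off the recovered scale. A minimal NumPy sketch of Umeyama alignment follows; the function name and the synthetic data are illustrative, not part of either package.)

```python
import numpy as np

def umeyama_similarity(src, dst):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    such that dst ~= s * R @ src + t (Umeyama, 1991). src/dst are (N, 3)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    X, Y = src - mu_s, dst - mu_d
    cov = Y.T @ X / len(src)                    # cross-covariance of the clouds
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                          # guard against a reflection
    R = U @ S @ Vt
    var_src = (X ** 2).sum() / len(src)         # variance of the source cloud
    scale = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# Illustrative check: a trajectory shrunk to 0.75x and shifted should
# come back with scale ~0.75, matching the kind of error described above.
rng = np.random.default_rng(0)
traj = rng.standard_normal((100, 3))
shrunk = 0.75 * traj + np.array([1.0, -2.0, 3.0])
scale, R, t = umeyama_similarity(traj, shrunk)
```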

I suspect that I did something wrong. Please see my launch file below; any suggestions would be much appreciated.

```python
import launch
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    visual_slam_node = ComposableNode(
        name='visual_slam_node',
        package='isaac_ros_visual_slam',
        plugin='isaac_ros::visual_slam::VisualSlamNode',
        parameters=[{
            'enable_rectified_pose': True,
            'denoise_input_images': True,
            'rectified_images': False,
            'enable_debug_mode': False,
            'debug_dump_path': '/tmp/elbrus',
            'enable_slam_visualization': True,
            'enable_landmarks_view': True,
            'enable_observations_view': True,
            'enable_localization_n_mapping': True,
            'enable_imu': True,
            'map_frame': 'map',
            'odom_frame': 'odom',
            'base_frame': 'camera_link',
            'input_left_camera_frame': '',
            'input_right_camera_frame': '',
            'path_max_size': 10000,
            'msg_filter_queue_size': 10000,
            'input_imu_frame': 'camera_imu_frame',
        }],
        remappings=[('stereo_camera/left/image', 'camera/infra1/image_rect_raw'),
                    ('stereo_camera/left/camera_info', 'camera/infra1/camera_info'),
                    ('visual_slam/imu', '/camera/imu'),
                    ('stereo_camera/right/image', 'camera/infra2/image_rect_raw'),
                    ('stereo_camera/right/camera_info', 'camera/infra2/camera_info')]
    )

    visual_slam_launch_container = ComposableNodeContainer(
        name='visual_slam_launch_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container',
        composable_node_descriptions=[
            visual_slam_node
        ],
        output='screen'
    )

    return launch.LaunchDescription([visual_slam_launch_container])
```

gordongrigor commented 1 year ago

The mention of IR images suggests the emitter on the RealSense camera is enabled. It needs to be disabled: the RealSense IR emitter projects a dot pattern into the left and right camera images that moves with the camera through the scene, which can cause issues for VSLAM.
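(Editor's note: a minimal sketch of launching the RealSense driver with the emitter off. The parameter names follow the `realsense2_camera` ROS 2 wrapper and may differ between wrapper versions, so treat this as an illustration rather than a drop-in launch file.)

```python
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # Start the RealSense driver with both IR streams enabled and the
    # projector emitter turned off, so no dot pattern appears in the
    # infra1/infra2 images consumed by VSLAM.
    realsense_node = Node(
        package='realsense2_camera',
        executable='realsense2_camera_node',
        parameters=[{
            'enable_infra1': True,
            'enable_infra2': True,
            'depth_module.emitter_enabled': 0,  # 0 = emitter off
        }],
    )
    return LaunchDescription([realsense_node])
```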

oscarpang commented 1 year ago

Yeah, I disabled the emitter entirely when recording the bag, and since it's an outdoor environment the emitter pattern wouldn't be visible anyway.

hemalshahNV commented 1 year ago

We've pushed out changes to isaac_ros_visual_slam_realsense.launch.py with updates for changes in Intel Realsense parameters. Could you try running with this launch file and let us know if you're still seeing large drift?

oscarpang commented 1 year ago

Thanks for the updated launch file. I already had most of the parameters from your updated launch file, except the frame rate and resolution: I was using 848x480 at 60 fps. Is 90 fps a requirement for getting reasonable drift? I recorded the rosbag2 at 60 fps because a higher frame rate introduces frame drops during recording.

Also, I tried VINS-Fusion with calibrated parameters. The calibration suggests there is a time offset between the infrared cameras and the IMU, and after I supplied that offset, VINS-Fusion worked properly. Is there a way to provide such a time offset in the isaac_ros_visual_slam framework?

Thanks
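(Editor's note: the thread does not mention a cuVSLAM parameter for a camera–IMU time offset, so one workaround is to shift the IMU message timestamps upstream, before they reach the SLAM node. A sketch of just the stamp arithmetic follows; the `shift_stamp` helper and the offset value are illustrative, not part of the package.)

```python
NS_PER_S = 1_000_000_000


def shift_stamp(sec: int, nanosec: int, offset_ns: int) -> tuple[int, int]:
    """Shift a ROS (sec, nanosec) stamp by a signed offset in nanoseconds.

    A republisher node could apply this to each sensor_msgs/Imu header
    before forwarding the message to /visual_slam/imu.
    """
    total = sec * NS_PER_S + nanosec + offset_ns
    # divmod keeps nanosec in [0, 1e9) even for negative offsets.
    return divmod(total, NS_PER_S)


# Example: apply a +5 ms offset, as a camera-IMU calibration might report.
sec, nanosec = shift_stamp(1_700_000_000, 998_000_000, 5_000_000)
```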