IntelRealSense / realsense-ros

ROS Wrapper for Intel(R) RealSense(TM) Cameras
http://wiki.ros.org/RealSense
Apache License 2.0

Error with "roslaunch realsense2_camera opensource_tracking.launch" #3123

Open Shrioe opened 2 weeks ago

Shrioe commented 2 weeks ago

Hi,

I am new to VSLAM. I am following the instructions in the "SLAM with D435i" tutorial. When I run roslaunch realsense2_camera opensource_tracking.launch, I encounter the errors shown below:

[ERROR] [1718263335.611787184]: Critical Error, NaNs were detected in the output state of the filter. This was likely due to poorly conditioned process, noise, or sensor covariances.
[ WARN] [1718263335.612158811]: Failed to meet update rate! Took 0.0086891700000000012677
Error:   TF_NAN_INPUT: Ignoring transform for child_frame_id "camera_link" from authority "unknown_publisher" because of a nan value in the transform (nan nan nan) (-nan nan -nan nan)
         at line 240 in /tmp/binarydeb/ros-noetic-tf2-0.7.7/src/buffer_core.cpp
Error:   TF_DENORMALIZED_QUATERNION: Ignoring transform for child_frame_id "camera_link" from authority "unknown_publisher" because of an invalid quaternion in the transform (-nan nan -nan nan)
         at line 255 in /tmp/binarydeb/ros-noetic-tf2-0.7.7/src/buffer_core.cpp

I also faced an issue when trying to replay the rosbag file after running roslaunch realsense2_camera opensource_tracking.launch offline:=true: it only opens a new RViz window without showing anything. May I ask what I am supposed to get at this step? [screenshot: 20240613_152020]

Could you give me some suggestions?

Thx!

MartyG-RealSense commented 2 weeks ago

Hi @Shrioe. After the D435i SLAM guide was published, the gyro and accel IMU topics were disabled by default in the RealSense ROS wrapper. Please try enabling the topics with the launch instruction below to see whether it makes a difference to the errors you are receiving.

roslaunch realsense2_camera opensource_tracking.launch enable_accel:=true enable_gyro:=true unite_imu_method:=linear_interpolation

Shrioe commented 2 weeks ago

Hello,

Thank you for your reply! Just to let you know, I'm using a Raspberry Pi 4B with Ubuntu 20.04.

For your information, after running the command you mentioned above the errors are still there, and RViz closes by itself a few seconds after launching.

Could you please provide some suggestions?

Thank you so much!

MartyG-RealSense commented 2 weeks ago

At https://github.com/IntelRealSense/realsense-ros/issues/2494, another RealSense ROS user on Ubuntu 20.04 experienced 'Critical Error, NaNs were detected in the output state of the filter' on a PC, so the cause may not be that you are using a Pi 4 board. SLAM in general is a processing-intensive activity, so this error can occur even on a full-size computer.

Looking at the full list of errors, most of them look like what I would call 'cascade errors' caused by the first NaN error. In other words, if the initial 'Critical Error: NaNs were detected' error had not occurred then the subsequent errors that refer to NaN likely would not have occurred.

The 'Failed to meet update rate!' warning might also be occurring because of the Critical Error.

Does stability improve if you use unite_imu_method:=copy in your launch instruction instead of setting it to linear_interpolation?
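
That is, the full command to test would be:

roslaunch realsense2_camera opensource_tracking.launch enable_accel:=true enable_gyro:=true unite_imu_method:=copy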

As you are using a Pi 4b, that could also be a cause of the errors because RealSense cameras usually have problems with accessing IMU data when used with Raspberry Pi boards.

Shrioe commented 2 weeks ago

Hello @MartyG-RealSense, referring to #2494, I tried to edit the rs_camera.launch file as shown below: [screenshot: 20240615_155139]
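
In short, the edit changes the gyro and accel arguments in rs_camera.launch from false to true, roughly like this (a sketch; the exact lines in the file may differ slightly):

<arg name="enable_gyro" default="true"/>
<arg name="enable_accel" default="true"/>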

But when I run roslaunch realsense2_camera opensource_tracking.launch enable_accel:=true enable_gyro:=true unite_imu_method:=linear_interpolation, there are no more NaN errors, but I still get:

15/06 14:26:33,788 WARNING [281472450236784] (messenger-libusb.cpp:42) control_transfer returned error, index: 300, error: Resource temporarily unavailable, number: b
[ WARN] [1718432786.188037476]: Failed to meet update rate! Took 0.0067153130000000005317
15/06 14:26:33,890 WARNING [281472450236784] (messenger-libusb.cpp:42) control_transfer returned error, index: 300, error: Resource temporarily unavailable, number: b

In RViz, there is no map shown. I cannot tick "map" or "camera_link" because under "frame" there is only "All enabled". And under "Image", no image is shown even though I changed the topic to /camera/color/image_raw. [screenshot: 20240615_142133]

MartyG-RealSense commented 2 weeks ago

There is a past case at the link below of an opensource_tracking.launch user who experienced the same warnings that you have. In their particular case the problem was apparently caused by the USB port that they were using the camera on.

https://support.intelrealsense.com/hc/en-us/community/posts/8338902895379-D435I-IMU-ISSUE

Another user at https://github.com/IntelRealSense/realsense-ros/issues/2386#issuecomment-1179073597 found that the problem was caused by having installed librealsense twice by different methods.

Shrioe commented 2 weeks ago

I am using the cable that came with the D435i, and I have checked the USB device details: [screenshot: 20240615_165628]

After checking, I found that the camera only has an issue (no image showing) when I change the gyro and accel arguments in rs_camera.launch to true (<arg name="enable_gyro" default="true"/> and <arg name="enable_accel" default="true"/>).

Shrioe commented 2 weeks ago

I have already tried the solutions from #2494, #2386, and https://support.intelrealsense.com/hc/en-us/community/posts/8338902895379-D435I-IMU-ISSUE. Since the image cannot appear if I modify enable_gyro from false to true, I only adjusted the fps value. When I then ran roslaunch realsense2_camera opensource_tracking.launch enable_accel:=true enable_gyro:=true unite_imu_method:=copy, everything reverted back to my initial problem, the NaN error, and now RViz closes by itself a few seconds after it pops up.

But if I run roslaunch realsense2_camera opensource_tracking.launch enable_accel:=true enable_gyro:=true unite_imu_method:interpolation _linear, then RViz appears, just that there is still no map shown. It only forms the map of the first image it captures; after that, even though I move the camera around, the map won't change. In this case, how can I record the rosbag to form the map, or is there any other possible solution? As long as I can form a map of the surroundings, a 2D map or a point cloud is OK for me.

What if I forget about the IMU features and just form the map without IMU data, is that possible?

*For your information, I have this info: RealSense ROS v2.3.2, built with librealsense v2.50.0, running with librealsense v2.50.0, device USB type 3.2.

MartyG-RealSense commented 2 weeks ago

The correct form of the unite_imu_method option is unite_imu_method:=linear_interpolation

The D435i SLAM guide is very old and its component installation instructions are for ROS1 Kinetic from around the year 2016. For newer ROS1 versions such as Noetic I recommend adjusting the installation instructions to use the newer Noetic versions of the components if you have not done so already.

imu_filter_madgwick: sudo apt-get install ros-noetic-imu-filter-madgwick
rtabmap_ros: sudo apt-get install ros-noetic-rtabmap-ros
robot_localization: sudo apt-get install ros-noetic-robot-localization
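
If it is more convenient, all three can be installed in a single command:

sudo apt-get install ros-noetic-imu-filter-madgwick ros-noetic-rtabmap-ros ros-noetic-robot-localization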


An alternative to the D435i SLAM guide that does not need an IMU may be ORB_SLAM2_ROS, which is compatible with ROS1 and the IMU-less D435 camera model.

https://github.com/appliedAI-Initiative/orb_slam_2_ros

There is also a newer version called ORB_SLAM3 that has been tested with Noetic.

https://github.com/thien94/orb_slam3_ros

Shrioe commented 2 weeks ago

I'm currently using ROS Noetic, and I installed the Noetic versions from the beginning: imu_filter_madgwick (sudo apt-get install ros-noetic-imu-filter-madgwick), rtabmap_ros (sudo apt-get install ros-noetic-rtabmap-ros) and robot_localization (sudo apt-get install ros-noetic-robot-localization).

Actually, I previously worked on ORB_SLAM3 for a few months too, but I always failed to build the ORB_SLAM3 folder after many attempts and still can't figure out the possible reasons; that's why I switched to the SLAM with D435i guide.

What if I reinstall everything, for example with Ubuntu 16.04 instead of Ubuntu 20.04, in order to use rtabmap; would that work? And if I use Ubuntu 16.04, I cannot use a Raspberry Pi 5, is that right?

Also, how about my earlier question: in this case how can I record the rosbag to form the map, or is there any other possible solution? As long as I can form a map of the surroundings, a 2D map or a point cloud is OK for me.

MartyG-RealSense commented 2 weeks ago

I would not recommend going back to Ubuntu 16.04 and Kinetic as the Kinetic branch of ROS1 is End of Life (retired).

Because of the problems with accessing the IMU of RealSense cameras on Raspberry Pi boards, any solution that uses gyro and accel data likely would not be suitable for Pi 4 or Pi 5.

Instead of using ROS, perhaps you could record a bag file of 2D data in the RealSense Viewer tool using its 'Record' button. You could then play back the bag in the Viewer and switch into the Viewer's 3D pointcloud mode to render the 2D bag data as a 3D pointcloud in real-time. The 3D pointcloud mode enables the pointcloud to be exported to a .ply pointcloud file that can then be loaded into 3D software for further editing, such as converting the pointcloud into a solid object.
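
If you prefer a command-line route, librealsense also includes the rs-convert tool, which can export a recorded bag straight to a .ply pointcloud file. A rough usage sketch, with placeholder file names:

rs-convert -i recording.bag -p output_cloud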

If you know how to compile C++ programming code then you could also consider using the rs-kinfu tool, which progressively builds up a pointcloud as the camera is moved, similar to how the map in the D435i SLAM guide is generated. The pointcloud can then be exported to a .ply.

https://github.com/IntelRealSense/librealsense/tree/master/wrappers/opencv/kinfu
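
If you do try rs-kinfu, a rough sketch of the build steps, assuming OpenCV with the contrib rgbd module is already installed (see the README at the link above for the canonical instructions):

git clone https://github.com/IntelRealSense/librealsense.git
cd librealsense
mkdir build && cd build
cmake .. -DBUILD_CV_EXAMPLES=true
make -j4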

Shrioe commented 2 weeks ago

I see, thank you so much for the clarification! May I ask whether there is any suitable website I can refer to on how to use the RealSense Viewer to record and process 2D data into a 3D point cloud?

MartyG-RealSense commented 1 week ago

The bag recording process in the Viewer is simple.

  1. Enable the stream types that you want to record to bag file and then click on the Record button at the top of the Viewer's options side-panel to begin recording those streams.

  2. To stop recording and create the bag file, click the Record button again.

  3. To play back the bag file in the Viewer, you can drag and drop the bag file into the center panel of the Viewer to automatically start playback.

  4. Once playback has begun, click on the 3D option in the top corner of the Viewer window to switch to 3D pointcloud mode and automatically render the 2D data as a pointcloud.

  5. Click the Export button at the top center of the Viewer window during playback to export to a pointcloud .ply file.


To find the location on your computer that the bag file will be saved to, click on the gear-wheel icon at the top corner of the Viewer and select Settings from its drop-down menu to open the settings configuration interface.


When the settings interface pops up, click on the Playback and Record tab and look at the text beside 'Default recording folder'.



You might also find the guide at the link below helpful; it covers exporting a 3D pointcloud from the Viewer's 3D mode to a .ply file.

https://www.andreasjakl.com/capturing-3d-point-cloud-intel-realsense-converting-mesh-meshlab/

Shrioe commented 1 week ago

Thank you so much for the guidance! I think I might still need to form a final map from the .ply file.

  1. If I use 3D tools such as MeshLab, does that mean I am no longer using VSLAM for mapping? My project is to use the VSLAM technique to perform mapping.

  2. Alternatively, do you mean that after I get the .bag file using realsense-viewer, I can proceed with the following to form the 2D map?

roscore >/dev/null 2>&1 &
rosparam set use_sim_time true
rosbag play my_bagfile_1.bag --clock
roslaunch realsense2_camera opensource_tracking.launch offline:=true
rosrun map_server map_saver map:=/rtabmap/proj_map -f my_map_1

  3. Actually, may I clarify again what roslaunch realsense2_camera opensource_tracking.launch offline:=true does? Is it that once I have a recorded rosbag, I need to run rosbag play my_bagfile_1.bag --clock and then proceed with the launch command above? But when I launch it, it only opens an RViz window with nothing inside and no configuration; there is also no "map" or "camera_link" option in the display panel, no pointcloud, and no image shown. I'm not sure what is supposed to happen when running this command; can you explain that to me?

  4. And I have tried to record the video in the RealSense Viewer, and I have a .bag file now; what should I do next to get the final map? I'm expecting to get something like this: ![Uploading 20240616_155501.jpg…]()

MartyG-RealSense commented 1 week ago

  1. MeshLab is not a SLAM tool. If you need to do VSLAM and cannot use the IMU then stella-vslam would be worth looking at, as it does not require an IMU.

https://github.com/stella-cv/stella_vslam

  2. The procedure from the D435i SLAM guide that you describe is correct. However, bag files recorded with RealSense SDK programs such as the RealSense Viewer have some differences from ROS rosbags and so may not be fully compatible with playback in ROS (see the quick check after this list).

  3. When the opensource_tracking.launch file is launched, it imports settings from the rs_camera.launch file and configures some additional settings that enable the camera to be used with RTABMAP.

The D435i SLAM guide has not been updated since January 2019, so it cannot be guaranteed that it will work correctly in the present day.

When it is functioning correctly, it is meant to look like the rotating map image and the resulting point cloud shown in the guide.


  4. Please paste your image into the comment box again, as it did not display in your last comment.
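
Regarding point 2, a quick way to check what a Viewer-recorded bag actually contains before trying it in ROS is rosbag info (the file name is a placeholder):

rosbag info my_bagfile_1.bag

If the topics and message types it lists differ from what opensource_tracking.launch subscribes to, playback will not drive the SLAM nodes.
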
Shrioe commented 1 week ago

Thank you for your response!

  1. Does that mean I cannot proceed with the ROS commands in the SLAM with D435i guide using the .bag file recorded in realsense-viewer, and that I must instead record using roslaunch realsense2_camera opensource_tracking.launch?

  2. I think the part I don't understand is the command roslaunch realsense2_camera opensource_tracking.launch offline:=true, which is run after replaying the rosbag. Normally, even when I have recorded a bag file in ROS, replayed it and run that command, I can't get any output that lets me proceed to form the 2D map or pointcloud, because I think the SLAM with D435i guide says the map-saving command can only be run once the system is up. May I get some advice from you at this step, regarding how I can convert my recorded .bag file (not from realsense-viewer) into the 2D map?

MartyG-RealSense commented 1 week ago

  1. I think whether a bag file recorded in the Viewer will play in ROS will depend on the contents of the messages stored in the bag. There is no harm in trying playback of the file in ROS to see what happens.

  2. I believe that you would need to include the IMU topic enabling commands in the offline launch in order for the bag's IMU data to be published.

roslaunch realsense2_camera opensource_tracking.launch enable_gyro:=true enable_accel:=true unite_imu_method:=linear_interpolation offline:=true
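
Combined with the playback steps from the guide, the full offline sequence would then be as below, with the launch run in a second terminal while the bag is playing:

roscore >/dev/null 2>&1 &
rosparam set use_sim_time true
rosbag play my_bagfile_1.bag --clock
roslaunch realsense2_camera opensource_tracking.launch enable_gyro:=true enable_accel:=true unite_imu_method:=linear_interpolation offline:=true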

Shrioe commented 1 week ago

  1. Yes, I have tried to run the .bag file I recorded through the RealSense Viewer, followed by roslaunch realsense2_camera opensource_tracking.launch offline:=true. It's just that we come back to the same question I mentioned above in number 3: even though I can replay the bag, I don't know how to use this command to get the final map, whether I need to run any other extra commands, do any configuration, and so on. I always get stuck at this step, so I recorded a video as a demonstration: https://youtu.be/PExElKPb_08?si=girdvFsoaoC1Us7Y

  2. Alright, I will try that.

Shrioe commented 1 week ago

Hello @MartyG-RealSense, any update regarding number 2?

MartyG-RealSense commented 1 week ago

If the D435i SLAM guide and ORB_SLAM do not work for you then the last available option with a chance of success may be the SLAM tool Kimera-VIO-ROS, which is compatible with Noetic.

https://github.com/MIT-SPARK/Kimera-VIO-ROS

Shrioe commented 1 week ago

Alright, thank you for the suggestions, I'll try that!

Anyway, based on my previous comment, can you tell me whether I'm replaying the rosbag and forming the map in the correct way?

MartyG-RealSense commented 1 week ago

I did not see a problem in your replay procedure. However, because a RealSense SDK bag was being used instead of a ROS rosbag it may not succeed because of the differences between SDK bags and rosbags.

Shrioe commented 1 week ago

Actually, the same thing happens even if I use a ROS rosbag: RViz starts without configuration and shows no response even while the rosbag is playing, so I'm not sure whether I have missed some important steps.

Just to add on: I have both ros-noetic-librealsense2 and ros-noetic-realsense2-camera on my Ubuntu; is this allowed when performing mapping using rtabmap?

MartyG-RealSense commented 1 week ago

I think that having ros-noetic-librealsense2 and ros-noetic-realsense2-camera on your computer means that the librealsense SDK and the ROS wrapper were both installed at the same time with the command below.

sudo apt-get install ros-$ROS_DISTRO-realsense2-camera

This installation method is called Method 1. Have you used this method?

Method 1 does not install realsense-viewer though. So if you used Method 1 and also have realsense-viewer on your computer, this would suggest that librealsense had been previously installed by a different method. In that situation there could be two different librealsense installations on the computer that are conflicting with each other.
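
One way to check is to list the RealSense-related packages and also look for a source-built copy of the library, which would not show up in dpkg but would typically leave files under /usr/local:

dpkg -l | grep realsense
ls /usr/local/lib | grep realsense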

Shrioe commented 1 week ago

- I followed the ROS1 legacy instructions as suggested at the link: https://github.com/IntelRealSense/realsense-ros/tree/ros1-legacy

- And yes, I have tried sudo apt-get install ros-$ROS_DISTRO-realsense2-camera.

- And yes, I can also launch the RealSense Viewer on my Ubuntu; does that mean I already have two redundant librealsense installations?

- I get the messages below in the terminal:

pi@raspberrypi:~$ dpkg -l | grep realsense
ii  ros-noetic-librealsense2      2.50.0-1focal.20211115.135230  arm64  Library for capturing data from the Intel(R) RealSense(TM) SR300, D400, L500 Depth cameras and T2xx Tracking devices.
ii  ros-noetic-realsense2-camera  2.3.2-1focal.20240111.191354   arm64  RealSense Camera package allowing access to Intel T265 Tracking module and SR300 and D400 3D cameras

MartyG-RealSense commented 1 week ago

If you have realsense-viewer then yes, it is likely that you have (or have previously had) two librealsense installations on your computer that were installed by different methods.

A table of librealsense versions at https://github.com/IntelRealSense/realsense-ros/issues/3026#issuecomment-1972696034, produced by the same grep command that you used and showing the same version numbers, indicates to me that you do not have more than one librealsense version installed.

When you run realsense-viewer, which version number is it? The version number will be at the top of the viewer window.

Shrioe commented 1 week ago

For your information, I have this info: RealSense ROS v2.3.2, built with librealsense v2.50.0, running with librealsense v2.50.0, device USB type 3.2.

Besides, I noticed the comment at https://github.com/IntelRealSense/realsense-ros/issues/3128#issuecomment-2170281414 mentioning roslaunch rtabmap_launch rtabmap.launch.

May I ask, should I run this command together with roslaunch realsense2_camera opensource_tracking.launch in order to get the map? Can you give me more information about this? Thank you!

MartyG-RealSense commented 1 week ago

Usually when using both the ROS wrapper launch file and RTABMAP launch file, rs_camera is launched first and then rtabmap.launch is launched secondly after the RealSense launch has completed.

  1. roslaunch realsense2_camera rs_camera.launch

  2. roslaunch rtabmap_launch rtabmap.launch
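
For reference, the rtabmap_ros documentation pairs these two launches with arguments along the lines below; the topic names assume the wrapper's default camera namespace:

roslaunch realsense2_camera rs_camera.launch align_depth:=true
roslaunch rtabmap_launch rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/aligned_depth_to_color/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info approx_sync:=false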

Shrioe commented 1 week ago

But what about roslaunch realsense2_camera opensource_tracking.launch?

MartyG-RealSense commented 1 week ago

opensource_tracking.launch contains settings that enable the camera to be used with RTABMAP, so I guess that if rtabmap.launch is used with rs_camera.launch (which lacks those RTABMAP settings) then rtabmap.launch takes care of configuring those settings instead.
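
From memory, opensource_tracking.launch starts with an include along these lines, so launching it is roughly equivalent to launching rs_camera.launch with some RTABMAP-friendly arguments preset (treat this as a sketch, not the exact file contents):

<include file="$(find realsense2_camera)/launch/rs_camera.launch">
    <arg name="align_depth" value="true"/>
    <arg name="unite_imu_method" value="linear_interpolation"/>
</include>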

MartyG-RealSense commented 4 days ago

Hi @Shrioe Do you require further assistance with this case, please? Thanks!