IntelRealSense / realsense-ros

ROS Wrapper for Intel(R) RealSense(TM) Cameras
http://wiki.ros.org/RealSense
Apache License 2.0

uvc streamer watchdog triggered on endpoint #2191

Closed: Zkjoker closed this issue 2 years ago

Zkjoker commented 2 years ago

My camera is a D435i, and I can't use filters:=pointcloud. If I use roslaunch realsense2_camera rs_camera.launch, everything works fine on both my laptop and my TX2. But when I use roslaunch realsense2_camera rs_camera.launch filters:=pointcloud, I can see the pointcloud in RViz on my laptop. On my TX2, however, it raises the error "uvc streamer watchdog triggered on endpoint". I have tried many of the solutions from other issues (for example, changing the FPS to 15), but the error never changes. I wonder whether you have identified the problem and what solutions are available?

MartyG-RealSense commented 2 years ago

Hi @Zkjoker Could you check whether you have the ability to change Color Transformer options in RViz on your Jetson TX2 when the pointcloud is enabled, as described in https://github.com/IntelRealSense/realsense-ros/issues/1967#issuecomment-873375954

Zkjoker commented 2 years ago

No, there is nothing. In fact, rostopic echo /camera/depth/color/points doesn't output anything.
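(A lighter-weight check here, as a sketch using standard ROS tooling and the same default topic name as above: rostopic hz prints the publish rate of the topic, or nothing at all if the topic is silent.)

    rostopic hz /camera/depth/color/points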

I have tried roslaunch realsense2_camera rs_camera.launch filters:=pointcloud depth_width:=640 depth_height:=480 depth_fps:=15. This way I can get the image and the pointcloud, but the delay is so large that it can't be used at all. The program is still not working properly.

I also found another method: roslaunch realsense2_camera rs_camera.launch filters:=pointcloud pointcloud_texture_stream:=RS2_STREAM_ANY. This way I can get a white pointcloud in RViz without any delay, but I can no longer get the image.

I found that many people have this problem in the issues. Have you ever bought an ARM computer such as a TX2 or Xavier for testing? It seems to be a bug rather than a limitation of the platform's computing performance.

Zkjoker commented 2 years ago

@MartyG-RealSense

MartyG-RealSense commented 2 years ago

There is another case at https://github.com/IntelRealSense/realsense-ros/issues/2087 regarding a RealSense ROS user with a Jetson TX2 who had problems only when the pointcloud filter was enabled.

In that discussion, the user found that their performance improved if they used a roslaunch instruction with a custom stream configuration that was suggested to them in https://github.com/IntelRealSense/realsense-ros/issues/2087#issuecomment-933667462 (though the FPS could still drop if the CPU became busy).

In addition to TX2, there have been a small number of issues reported for other Jetson models such as Xavier when generating pointclouds in the ROS wrapper. It is not clear why this phenomenon occurs on Jetson board models.

Could you also try adding initial_reset:=true to your roslaunch instruction, please, to reset the camera at launch and see whether it has a positive effect on your TX2's pointcloud performance.
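For example, combined with the pointcloud filter (a minimal sketch; keep whatever other arguments you are already using):

    roslaunch realsense2_camera rs_camera.launch filters:=pointcloud initial_reset:=true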

Zkjoker commented 2 years ago

@MartyG-RealSense Thank you for your reply. I have searched the issues for this problem and tried different methods. initial_reset:=true doesn't have a positive effect. I have tried roslaunch realsense2_camera rs_camera.launch filters:=pointcloud depth_width:=640 depth_height:=480 depth_fps:=15; it can work, but the delay is too long.

In #2078, the command is roslaunch realsense2_camera rs_camera.launch filters:=pointcloud depth_width:=848 depth_height:=480 depth_fps:=15 color_width:=1280 color_height:=720 color_fps:=15, but that user's camera is a D455 and mine is a D435i. Could you please tell me which depth_width/depth_height and color_width/color_height values I should use? Thank you!

In fact, I would also prefer not to use a Jetson; many programs that work on x86 computers do not work there. But for robots it is a common host computer option. I still hope you can pay attention to this, because if the ROS driver cannot be used on the robot, the driver becomes meaningless.

Thank you very much for your reply!

MartyG-RealSense commented 2 years ago

848x480 depth and 1280x720 color will work with the D435i too, as the D455 is essentially an improved D435i. The main difference between the two models in regard to FPS is that the minimum FPS on the D455 is '5' instead of '6'.
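So, as a sketch, the same stream configuration quoted from that discussion should be usable unchanged with your D435i:

    roslaunch realsense2_camera rs_camera.launch filters:=pointcloud depth_width:=848 depth_height:=480 depth_fps:=15 color_width:=1280 color_height:=720 color_fps:=15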

If you would prefer not to use a Jetson for robotics, there are other options for robotics applications. A RealSense-compatible example of a board suitable for robotics is the UP Board.

https://up-shop.org/boards-modules/boards-modules.html

Intel also produces a range of computing equipment suitable for industrial applications called the NUC. It is available in a wide range of low-end to high-end configurations, price points, and form factors (board, partially completed kit, and ready-to-run PC), and it is small enough to fit onto a mobile robot.

https://www.intel.com/content/www/us/en/products/details/nuc.html

MartyG-RealSense commented 2 years ago

Hi @Zkjoker Do you require further assistance with this case, please? Thanks!

Zkjoker commented 2 years ago

Of course. I still hope you can find a real way to solve the problem on the TX2 or Xavier.

MartyG-RealSense commented 2 years ago

A RealSense ROS user at https://github.com/IntelRealSense/realsense-ros/issues/1964 who had a Jetson AGX Xavier saw very high CPU usage when the pointcloud was enabled. They reduced CPU usage significantly by applying a Decimation Filter, at the cost of a reduction in resolution.
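If you would like to try that, here is a hedged sketch of the launch line, assuming the wrapper's filters argument accepts a comma-separated list of post-processing filters that includes decimation (the decimation magnitude can then be tuned at runtime, for example through rqt_reconfigure):

    roslaunch realsense2_camera rs_camera.launch filters:=decimation,pointcloud

A higher decimation magnitude lowers the depth resolution further, but it also reduces the number of points that have to be generated for the pointcloud.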

MartyG-RealSense commented 2 years ago

Hi @Zkjoker Do you have an update about this case that you can provide, please? Was the comment above about applying a Decimation Filter helpful? Thanks!

Zkjoker commented 2 years ago

I haven't been able to debug recently because of a business trip. I'm very sorry, I can't yet tell you whether this method is effective.

MartyG-RealSense commented 2 years ago

Okay, that is fine. Please do update when you are able to do debugging work again. Thanks!

MartyG-RealSense commented 2 years ago

Case closed due to no further comments received.