IntelRealSense / realsense-ros

ROS Wrapper for Intel(R) RealSense(TM) Cameras
http://wiki.ros.org/RealSense
Apache License 2.0
2.59k stars, 1.76k forks

No pointcloud2 data in rviz when RaspberryPI4 run rs_camera.launch #1285

Closed jacky1089 closed 4 years ago

jacky1089 commented 4 years ago

We are trying to get the Intel RealSense D435i to work on our Raspberry Pi 4 with Raspbian OS and ROS Melodic. After configuring the Raspberry Pi with Raspbian and installing ROS Melodic on it, we enabled the pointcloud option in rs_camera.launch. When we connect our RealSense camera to the Raspberry Pi and run the following command:

$ roslaunch realsense2_camera rs_camera.launch

$ rviz

After we add the PointCloud2 topic in rviz, no data appears in the grid area.

The same function works well on a PC running Ubuntu 18.
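For reference, the pointcloud filter does not have to be baked into the launch file; it can be enabled from the command line using rs_camera.launch's standard `filters` argument (a sketch; without this filter the pointcloud topic is not published at all):

```shell
# Enable the pointcloud processing block at launch time.
# The resulting topic is /camera/depth/color/points by default.
roslaunch realsense2_camera rs_camera.launch filters:=pointcloud
```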

jacky1089 commented 4 years ago

All information displayed in the terminal is as follows:

$ roslaunch realsense2_camera rs_camera.launch

[ INFO] [1594725439.961364578]: Device with physical ID 2-1-2 was found.
[ INFO] [1594725439.961664210]: Device with name Intel RealSense D435I was found.
[ INFO] [1594725439.963363850]: Device with port number 2-1 was found.

[ INFO] [1594725440.071997301]: JSON file is not provided
[ INFO] [1594725440.072265840]: ROS Node Namespace: camera
[ INFO] [1594725440.072360988]: Device Name: Intel RealSense D435I
[ INFO] [1594725440.072444989]: Device Serial No: 902512070700
[ INFO] [1594725440.072564582]: Device physical port: 2-1-2
[ INFO] [1594725440.072731879]: Device FW version: 05.12.06.00
[ INFO] [1594725440.072809991]: Device Product ID: 0x0B3A
[ INFO] [1594725440.072884936]: Enable PointCloud: On
[ INFO] [1594725440.072961325]: Align Depth: On
[ INFO] [1594725440.073043770]: Sync Mode: On
[ INFO] [1594725440.073217938]: Device Sensors:
[ INFO] [1594725440.073382994]: Stereo Module was found.
[ INFO] [1594725440.073535625]: RGB Camera was found.
[ INFO] [1594725440.073747997]: Motion Module was found.
[ INFO] [1594725440.073868757]: (Fisheye, 0) sensor isn't supported by current device! -- Skipping...
[ INFO] [1594725440.073952313]: (Fisheye, 1) sensor isn't supported by current device! -- Skipping...
[ INFO] [1594725440.074026480]: (Fisheye, 2) sensor isn't supported by current device! -- Skipping...
[ INFO] [1594725440.074104980]: (Pose, 0) sensor isn't supported by current device! -- Skipping...
[ INFO] [1594725440.074210073]: Add Filter: pointcloud
[ INFO] [1594725440.079305660]: num_filters: 1
[ INFO] [1594725440.079525920]: Setting Dynamic reconfig parameters.
[ INFO] [1594725440.603827323]: Done Setting Dynamic reconfig parameters.
[ INFO] [1594725440.649893637]: depth stream is enabled - width: 640, height: 480, fps: 30, Format: Z16
[ INFO] [1594725440.651105923]: infra1 stream is enabled - width: 640, height: 480, fps: 30, Format: Y8
[ INFO] [1594725440.652207781]: infra2 stream is enabled - width: 640, height: 480, fps: 30, Format: Y8
[ INFO] [1594725440.681688663]: color stream is enabled - width: 640, height: 480, fps: 30, Format: RGB8

[ INFO] [1594725440.700458979]: Expected frequency for depth = 30.00000
[ INFO] [1594725440.716483409]: Expected frequency for infra1 = 30.00000
[ INFO] [1594725440.720687731]: Expected frequency for aligned_depth_to_infra1 = 30.00000
[ INFO] [1594725440.724823182]: Expected frequency for infra2 = 30.00000
[ INFO] [1594725440.729048226]: Expected frequency for color = 30.00000
[ INFO] [1594725440.733129621]: Expected frequency for aligned_depth_to_color = 30.00000

[ INFO] [1594725440.755108587]: insert Depth to Stereo Module
[ INFO] [1594725440.755455811]: insert Color to RGB Camera
[ INFO] [1594725440.755675849]: insert Infrared to Stereo Module
[ INFO] [1594725440.755867554]: insert Infrared to Stereo Module
[ INFO] [1594725440.756051407]: insert Gyro to Motion Module
[ INFO] [1594725440.756205297]: insert Accel to Motion Module
[ INFO] [1594725440.937136851]: SELECTED BASE:Depth, 0
14/07 19:17:20,961 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 300, error: No data available, number: 3d
[ INFO] [1594725441.016891329]: RealSense Node Is Up!
14/07 19:17:21,074 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
14/07 19:17:21,124 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
14/07 19:17:21,176 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
14/07 19:17:21,367 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
14/07 19:17:21,418 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
14/07 19:17:21,470 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
14/07 19:17:21,520 WARNING [2770322448] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61

[ERROR] [1594725445.067745600]: An error has occurred during frame callback: Error occured during execution of the processing block! See the log for more info
[ WARN] [1594725573.473673650]: No stream match for pointcloud chosen texture Process - Color
[ERROR] [1594725575.603129171]: An error has occurred during frame callback: Error occured during execution of the processing block! See the log for more info
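When rviz shows nothing, a quick way to narrow things down is to check whether the pointcloud topic exists and is actually publishing (topic name assumes the default `camera` namespace with the pointcloud filter enabled):

```shell
# List pointcloud-related topics advertised by the node.
rostopic list | grep points

# Measure the actual publish rate; if frames are arriving this
# reports a rate, otherwise it sits waiting with no output.
rostopic hz /camera/depth/color/points
```

If the topic publishes at a healthy rate here, the problem is on the rviz side (fixed frame, topic selection); if it is silent, the problem is in the camera node itself.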

MartyG-RealSense commented 4 years ago

Hi @jacky1089 There was a recent case where the error An error has occurred during frame callback: Error occured during execution of the processing block! disappeared when a camera reset instruction called initial_reset was included in the launch instruction. Does including it in your launch fix that error for you too, please?

roslaunch realsense2_camera rs_camera.launch initial_reset:=true

jacky1089 commented 4 years ago

I set the parameter according to the method you provided, but it still hasn't solved the problem @MartyG-RealSense

At first, I thought the problem was uvcvideo. I patched the Linux kernel according to this document, but the problem remained: https://github.com/IntelRealSense/librealsense/blob/master/scripts/realsense-camera-formats.patch
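For context, that patch file is normally applied through librealsense's helper script rather than by hand. A sketch of the usual flow (the script targets Ubuntu LTS kernels, so on Raspbian the exact steps may differ):

```shell
# From a librealsense source checkout: the helper script applies the
# RealSense format patches and rebuilds the uvcvideo kernel module.
cd librealsense
./scripts/patch-realsense-ubuntu-lts.sh

# Reload the patched module so the changes take effect without a reboot.
sudo modprobe -r uvcvideo && sudo modprobe uvcvideo
```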

jacky1089 commented 4 years ago

@MartyG-RealSense After the kernel is patched, running the dmesg command shows the following uvcvideo content:

[ 4.261891] uvcvideo: Found UVC 1.50 device Intel(R) RealSense(TM) Depth Camera 435i (8086:0b3a)
[ 4.270673] uvcvideo: Unable to create debugfs 2-2 directory.
[ 4.271097] uvcvideo 2-1:1.0: Entity type for entity Intel(R) RealSense(TM) Depth Ca was not initialized!
[ 4.271113] uvcvideo 2-1:1.0: Entity type for entity Processing 2 was not initialized!
[ 4.271128] uvcvideo 2-1:1.0: Entity type for entity Camera 1 was not initialized!
[ 4.271482] input: Intel(R) RealSense(TM) Depth Ca as /devices/platform/scb/fd500000.pcie/pci0000:00/0000:00:00.0/0000:01:00.0/usb2/2-1/2-1:1.0/input/input4
[ 4.271935] uvcvideo: Found UVC 1.50 device Intel(R) RealSense(TM) Depth Camera 435i (8086:0b3a)
[ 4.276068] uvcvideo: Unable to create debugfs 2-2 directory.
[ 4.276510] uvcvideo 2-1:1.3: Entity type for entity Processing 7 was not initialized!
[ 4.276525] uvcvideo 2-1:1.3: Entity type for entity Extension 8 was not initialized!
[ 4.276539] uvcvideo 2-1:1.3: Entity type for entity Camera 6 was not initialized!
[ 4.277802] usbcore: registered new interface driver uvcvideo
[ 4.277813] USB Video Class driver (1.1.1)

MartyG-RealSense commented 4 years ago

It has been recommended that librealsense be built for the Raspberry Pi using the libuvc-backend installation method, due to problems with building from source on the Pi's ARM processor using the kernel-patching method.

https://github.com/IntelRealSense/librealsense/blob/master/doc/libuvc_installation.md
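The linked guide essentially boils down to downloading and running a single build script; a sketch of its main steps (script name as given in that document):

```shell
# Fetch the libuvc-backend build script from the librealsense repo
# and run it; it builds the SDK without touching the kernel.
wget https://github.com/IntelRealSense/librealsense/raw/master/scripts/libuvc_installation.sh
chmod +x ./libuvc_installation.sh
./libuvc_installation.sh
```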

jacky1089 commented 4 years ago

I did compile with the UVC-backend option, but the problem still exists. So I also patched uvcvideo, and the patched driver started normally. I don't know why.

I need to use the D435i for visual odometry on the Raspberry Pi 4, and I need pointcloud data. On a PC all functions work normally, but its power consumption is too high, so the camera has to run on the Raspberry Pi 4.

@MartyG-RealSense , can the D435i successfully run under ROS on the Raspberry Pi 4 and display pointcloud images through rviz?

If the D435i can't do this, we'll have to buy a lidar.
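One thing that sometimes helps on a Pi 4 is lowering the stream resolution and frame rate to reduce USB and CPU load. A sketch using rs_camera.launch's standard stream arguments (the values are illustrative; 424x240 at 6 fps is a valid D435i depth mode):

```shell
# Launch with the pointcloud filter but a much lighter stream profile,
# which reduces the bandwidth the Pi 4 has to sustain.
roslaunch realsense2_camera rs_camera.launch \
    filters:=pointcloud \
    depth_width:=424 depth_height:=240 depth_fps:=6 \
    color_width:=424 color_height:=240 color_fps:=6
```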

jacky1089 commented 4 years ago

When I run the realsense-viewer, the 2D/3D pointcloud displays normally.

MartyG-RealSense commented 4 years ago

The Pi 4 can work with the D435i. A common pattern in ROS-related Pi 4 questions, though, is that the Pi 4 works fine with librealsense and programs such as the RealSense Viewer but encounters problems when used with ROS.

Intel has a pre-made librealsense SD card image for the Pi 4 that can be downloaded at the link below. This may act as a stable base to build upon.

https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras#section-2-3-preparing-the-sd-card
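For completeness, writing a downloaded image to an SD card on Linux can be sketched as follows (the filenames are illustrative, and /dev/sdX must be replaced with the actual card device, which the command will erase):

```shell
# Unpack the downloaded image archive (illustrative filename).
unzip realsense_pi4_image.zip

# Write it to the SD card; dd overwrites the whole device,
# so double-check the target with `lsblk` first.
sudo dd if=realsense_pi4_image.img of=/dev/sdX bs=4M status=progress conv=fsync
```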

MartyG-RealSense commented 4 years ago

Case closed due to no further comments received.

yiyinglai commented 4 years ago

Which ROS distro does this pre-built image contain?

> Pi 4 can work with the D435i. A common pattern in ROS related Pi 4 questions though is that the Pi 4 works fine with Librealsense and programs such as the RealSense Viewer but encounters problems when used with ROS.
>
> Intel have a pre-made Librealsense SD card image for Pi 4 that can be downloaded at the link below. This may act as a stable base to build upon.
>
> https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras#section-2-3-preparing-the-sd-card

MartyG-RealSense commented 4 years ago

Hi @yiyinglai I checked the paper and I don't think it uses ROS (ROS is suggested as an alternative to the RTP/RTSP protocols if the RealSense user wants to adapt the project). So ROS would not be on the SD card image, since the default project does not need it.

yiyinglai commented 4 years ago

@MartyG-RealSense thanks for your prompt reply. The issue is about a point cloud error in ROS, so I thought the link you provided had librealsense and realsense-ros built in.

MartyG-RealSense commented 4 years ago

Hi @yiyinglai ,

@jacky1089 was apparently having some problems with the librealsense installation.

https://github.com/IntelRealSense/realsense-ros/issues/1285#issuecomment-658278253

So I suggested using the SD card image to provide a stable installation of librealsense on their Pi 4 as a foundation to manually build other components upon such as the ROS wrapper. I apologise for any confusion caused.

yiyinglai commented 4 years ago

Hi @MartyG-RealSense , @jacky1089 had no problem running realsense-viewer.
https://github.com/IntelRealSense/realsense-ros/issues/1285#issuecomment-658306045

It is the same in my case: able to run realsense-viewer properly (RGB image and point cloud are available); not able to run rs_camera.launch properly (RGB image and depth image available, point cloud not available).

Can you take a look at https://github.com/IntelRealSense/realsense-ros/issues/1324, where the details of my setup are provided?

MartyG-RealSense commented 4 years ago

I'm not currently on duty @yiyinglai as it is almost midnight in my time zone. I will be back in 7 hours from the time of writing this and will look at your new case in the morning. Thanks very much for your patience.

wegunterjr commented 4 years ago

So, I have a question about this: https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras#section-2-3-preparing-the-sd-card

Does it interact easily with the available ROS driver, https://github.com/IntelRealSense/realsense-ros ? How do I deal with the driver being on the Pi now, and have it interact with the ROS install on my machine? Thanks!

MartyG-RealSense commented 4 years ago

Hi @wegunterjr Section 3.5.1 of the ethernet networking white paper suggests that using ROS for networking should be possible.

https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras#section-3-5-design-considerations

As the above link suggests though, creating linkage between ROS and librealsense may not be straightforward. The RealSense ROS wrapper and the librealsense SDK operate independently, with changes made in one not usually affecting the other.

wegunterjr commented 3 years ago

> Hi @wegunterjr Section 3.5.1 of the ethernet networking white paper suggests that using ROS for networking should be possible.
>
> https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras#section-3-5-design-considerations
>
> As the above link suggests though, creating linkage between ROS and librealsense may not be straightforward. The RealSense ROS wrapper and the librealsense SDK operate independently, with changes made in one not usually affecting the other.

hmmm.... got ya. saw that too.

MartyG-RealSense commented 3 years ago

You can connect to a camera on the Pi from a central computer by launching the RealSense Viewer program and using the Add Source button to add the camera(s) to the Viewer as a Network Device.

https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras#section-2-7-testing-the-camera

In regard to using ROS with it: when the white-paper mentions ROS, it has the number [10] beside it that leads to a RealSense ROS guide published by Intel for using 3 cameras across 2 computers:

https://github.com/IntelRealSense/realsense-ros/wiki/showcase-of-using-3-cameras-in-2-machines

I do not have any time-schedule estimate to offer about the implementation of IMU streaming on the ethernet network.

wegunterjr commented 3 years ago

@MartyG-RealSense thanks a ton for taking the time to respond. Very cool! That reference 10 is still looking at cameras with USB connections, but I appreciate the info.

MartyG-RealSense commented 3 years ago

You are very welcome @wegunterjr

An alternative to the open-source ethernet white paper's system that you may also find interesting is an earlier RealSense ethernet networking project called EtherSense, whose software is based on Python.

https://github.com/krejov100/EtherSense

https://dev.intelrealsense.com/docs/depth-camera-over-ethernet-whitepaper

Edit: a method of using ethernet with realsense2_camera by configuring ROS_MASTER_URI is detailed in the link below:

https://answers.ros.org/question/357185/intel-realsense-nodelets-running-with-remote-roscore-are-dying-bond-broken/
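The approach in that link amounts to running roscore on one machine and pointing the Pi's node at it via the standard ROS networking variables. A minimal sketch, where the IP addresses are assumptions for illustration:

```shell
# On the central machine (runs roscore), assumed to be 192.168.1.10:
export ROS_MASTER_URI=http://192.168.1.10:11311  # where roscore listens
export ROS_IP=192.168.1.10                       # address other nodes use to reach this host

# On the Pi 4 (assumed 192.168.1.20), point at the same master
# but advertise the Pi's own address, then launch the camera node:
#   export ROS_MASTER_URI=http://192.168.1.10:11311
#   export ROS_IP=192.168.1.20
#   roslaunch realsense2_camera rs_camera.launch filters:=pointcloud
```

With this in place, rviz on the central machine can subscribe to the pointcloud topic published from the Pi.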