IntelRealSense / librealsense

Intel® RealSense™ SDK
https://www.intelrealsense.com/
Apache License 2.0

RuntimeError: Frames didn't arrive within 10000 #13272

Closed: haidaumitom closed this issue 1 week ago

haidaumitom commented 3 weeks ago

Required Info
Camera Model D435
Firmware Version 5.16.0.1
Operating System & Version Ubuntu 22.04 Server and ROS2 Humble
Kernel Version (Linux Only) 6.8.0-40-generic
Platform Raspberry Pi 4 model B
SDK Version 2.55.1.0
Language Python

Issue Description

I installed librealsense with the LibUVC-backend installation method. The goal of my project is to create a ROS2 Publish Node on Raspberry Pi 4 (Ubuntu server 22.04) and a ROS2 Subscriber Node on my laptop (Ubuntu 22.04 desktop). After that, received data will be published to the Subscriber Node and be displayed on Rviz2 for further works. The Raspberry Pi 4's only goal is to publish all data to the laptop.
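For reference, here is a minimal sketch of what the laptop-side subscriber could look like. It is only an illustration: it assumes the topic names color_image and depth_image used by the publisher below, and the node and window names are made up for the example.

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge
import cv2

class RealSenseSubscriber(Node):
    def __init__(self):
        super().__init__('realsense_subscriber')
        self.bridge = CvBridge()
        # Topic name assumed to match the publisher running on the Pi.
        self.create_subscription(Image, 'color_image', self.color_callback, 10)

    def color_callback(self, msg):
        # Convert the ROS Image message back to an OpenCV BGR frame and display it.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        cv2.imshow('color', frame)
        cv2.waitKey(1)

def main(args=None):
    rclpy.init(args=args)
    node = RealSenseSubscriber()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()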

Here is my publisher node file on the Pi 4:

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge
import pyrealsense2 as rs
import numpy as np
import copy


class RealSensePublisher(Node):
    def __init__(self):
        super().__init__('realsense_publisher')

        # ROS 2 publishers
        self.color_publisher = self.create_publisher(Image, 'color_image', 10)
        self.depth_publisher = self.create_publisher(Image, 'depth_image', 10)

        self.bridge = CvBridge()

        # RealSense pipeline: depth and color at 640x480, 15 FPS
        self.pipeline = rs.pipeline()
        self.config = rs.config()
        self.config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 15)
        self.config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 15)
        self.pipeline.start(self.config)

        self.timer = self.create_timer(0.1, self.timer_callback)

    def timer_callback(self):
        try:
            frames = self.pipeline.wait_for_frames(10000)
            depth_frame = frames.get_depth_frame()
            color_frame = frames.get_color_frame()

            if not depth_frame or not color_frame:
                self.get_logger().warn("No frame received, skipping this iteration")
                return

            depth_image = np.asanyarray(depth_frame.get_data())
            color_image = np.asanyarray(color_frame.get_data())

            depth_msg = self.bridge.cv2_to_imgmsg(depth_image, encoding="passthrough")
            color_msg = self.bridge.cv2_to_imgmsg(color_image, encoding="bgr8")

            self.depth_publisher.publish(depth_msg)
            self.color_publisher.publish(color_msg)

        except RuntimeError as e:
            self.get_logger().error(f"RuntimeError encountered: {str(e)}")
            self.get_logger().warn("Skipping this frame due to error")

    def send_to_processes(self, rgb, depth):
        # Note: this method is never called above, and 'processes' / 'send'
        # are not defined in this file.
        while True:
            rgb_ = copy.deepcopy(rgb)
            depth_ = copy.deepcopy(depth)

            for queue in processes.values():
                send(queue, {'rgb': rgb_, 'depth': depth_})


def main(args=None):
    rclpy.init(args=args)
    node = RealSensePublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()

When I run the file, it hits the RuntimeError handled in the callback:

[ERROR] [1724138568.671391456] [realsense_publisher]: RuntimeError encountered: Frame didn't arrive within 10000
[WARN] [1724138568.675346266] [realsense_publisher]: Skipping this frame due to error
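As an aside, if the pyrealsense2 build in use exposes try_wait_for_frames(), the miss can be reported as a boolean instead of an exception. A hedged sketch of the timer callback using it (the 5000 ms timeout is an arbitrary choice, and this only changes how the miss is reported; it does not make frames arrive):

    def timer_callback(self):
        # try_wait_for_frames() returns a success flag plus the frameset,
        # so a missing frameset is a False result rather than a RuntimeError.
        success, frames = self.pipeline.try_wait_for_frames(5000)
        if not success:
            self.get_logger().warn("No frameset within 5 s, skipping this iteration")
            return

        depth_frame = frames.get_depth_frame()
        color_frame = frames.get_color_frame()
        if not depth_frame or not color_frame:
            return

        depth_msg = self.bridge.cv2_to_imgmsg(np.asanyarray(depth_frame.get_data()), encoding="passthrough")
        color_msg = self.bridge.cv2_to_imgmsg(np.asanyarray(color_frame.get_data()), encoding="bgr8")
        self.depth_publisher.publish(depth_msg)
        self.color_publisher.publish(color_msg)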

The camera and depth sensor themselves seem to be working fine, as I can see red flashes from the front of the camera. In rviz2, the image topics are recognized but "No image" is shown.

[Screenshot from 2024-08-20 14-29-08]

I even ran the command ros2 node list; the node is created and recognized.

[Screenshot from 2024-08-20 14-31-09]

I have read many issues (including #6628) stating that the error RuntimeError: Frames didn't arrive within 10000 is related to the cable and that unplugging and replugging would help, but in my case none of those solutions worked.
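As a side note, the unplug-and-replug advice can be approximated in software: pyrealsense2 exposes hardware_reset() on the device object. A small sketch; the camera needs several seconds to disconnect and re-enumerate before a pipeline is started again, and the 5-second sleep is only a guess:

import time
import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    # Power-cycle each connected RealSense device in firmware.
    print("Resetting", dev.get_info(rs.camera_info.name))
    dev.hardware_reset()

# Allow the USB device to drop off the bus and come back before reuse.
time.sleep(5)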

If anyone who has been stuck in a case like mine has found a solution, I would really appreciate the help. Thanks for reading!

MartyG-RealSense commented 3 weeks ago

Hi @haidaumitom As your research has likely shown, RealSense cameras tend to have problems when used with Raspberry Pi boards. The best that can be hoped for is to stream depth and color alone, with no alignment, pointclouds or post-processing. Often, the situation that you have experienced will occur - depth will stream normally but RGB color frames will not be received.
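To separate ROS 2 from the camera problem, it can also help to first confirm that bare depth and color streaming works on the Pi with pyrealsense2 alone. A minimal sketch; the 424x240 at 15 FPS values are only a suggestion to lighten the USB load and are not from the original report:

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Low resolution and framerate to reduce USB bandwidth on the Pi.
config.enable_stream(rs.stream.depth, 424, 240, rs.format.z16, 15)
config.enable_stream(rs.stream.color, 424, 240, rs.format.bgr8, 15)
pipeline.start(config)

try:
    for i in range(100):
        frames = pipeline.wait_for_frames(10000)
        # Report which streams actually delivered a frame this cycle.
        print(i, "depth:", bool(frames.get_depth_frame()), "color:", bool(frames.get_color_frame()))
finally:
    pipeline.stop()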

Usually there is not a solution when the color frames are not being delivered, except for using the RGB stream from the left infrared sensor instead of the RGB sensor. The D435 model does not support this 'color from left infrared sensor' feature, though, as it is only available on the D405, D415 and D455-type models (D455, D456, D457, etc).
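For anyone checking what their own camera advertises (for example, whether any color format is listed for the infrared stream), the supported stream profiles can be enumerated with pyrealsense2. This sketch only lists what the firmware reports and is not specific to the left-IR-color feature:

import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    print(dev.get_info(rs.camera_info.name))
    for sensor in dev.query_sensors():
        for profile in sensor.get_stream_profiles():
            # Skip non-image profiles (e.g. motion streams on IMU-equipped models).
            if not profile.is_video_stream_profile():
                continue
            vp = profile.as_video_stream_profile()
            print(" ", profile.stream_name(), f"{vp.width()}x{vp.height()}", profile.format(), "@", profile.fps(), "FPS")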

However, a RealSense user at https://github.com/IntelRealSense/realsense-ros/issues/3011#issuecomment-1951472270 who was also using a Pi with Python and a ROS2 node achieved a successful solution for streaming depth and RGB color and shared it.

MiTomMi commented 3 weeks ago

Hi @MartyG-RealSense, thanks for your reply.

As you said, depth would stream normally while RGB color frames might not be received: I tried streaming only depth frames and they were displayed really well. However, streaming RGB color frames alone did not succeed.

Additionally, I also tried Henryburon's method that you mentioned above, but it still did not work. I think I'll move to another camera model instead of the D435.

Thank you for your help! I really appreciate that.

MartyG-RealSense commented 3 weeks ago

Hi @MiTomMi You are very welcome. Thanks very much for the update. I'm sincerely sorry that you did not get the outcome that you were aiming for.

MartyG-RealSense commented 1 week ago

Case closed due to no further comments received.