luxonis / depthai-ros

Official ROS Driver for DepthAI Sensors.
MIT License

Low FPS on image topics #85

Closed: iripatx closed this issue 12 months ago

iripatx commented 2 years ago

Hi,

I'm trying to set up a quick ROS pipeline using the rgb_publisher from depthai-ros-examples. It works, but the throughput is awfully low, averaging around 8 FPS.

Exploring further using rostopic hz, I found that the rgb_publisher node is publishing at that framerate:

average rate: 8.036 min: 0.020s max: 0.212s std dev: 0.04030s window: 6989

Here are some specs about the deployment. Everything is dockerized in a container that extends NVIDIA's L4T image:

Do I have to specify the framerate somewhere in the node's code? Image topics publish at 21 FPS on the stereo_publisher node and at 36 FPS on the mobilenet_publisher.

Thank you.

Edit:

Using colorCam->setFps(30); in createPipeline() has no effect.

saching13 commented 2 years ago

Let me check and get back on this.

saching13 commented 2 years ago

Hello, this looks like a PoE bandwidth limitation. The mobilenet publisher sends a cropped 300x300 image, hence it is faster, whereas rgb_publisher publishes at full 1080p resolution, which consumes 0.46349 gigabits per second at just 10 FPS. Hence the low FPS.
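
For reference, the arithmetic behind that figure (assuming uncompressed 1080p BGR frames at 3 bytes per pixel; the total comes out in binary gigabit units):

    1920x1080 * 3 Bpp = 6220800 bytes/frame
    6220800 * 10 FPS * 8 = 497664000 bits/s ≈ 0.46349 Gib/s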

iripatx commented 2 years ago

Hello saching, thank you very much for your quick response.

I was wondering, is this bandwidth limitation caused by the ROS nodes publishing uncompressed images, or is it coming from the camera itself? I thought that ROS doesn't put anything on the network unless a node on another machine subscribes to a topic, but I may be wrong.

Should I specify some kind of encoding on the DepthAI pipeline?

Thank you.

saching13 commented 2 years ago

This is a PoE limitation coming from the DepthAI device. You can increase the FPS by reducing the size of the image stream: it is currently 1080p, and if you change it to 720p, for example, it will be faster.

Also try adding the line below; it might reduce the overhead:

    pipeline->setXLinkChunkSize(0);
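
For context, a minimal sketch (not the exact example code) of where that call would sit in a createPipeline()-style setup:

    dai::Pipeline createPipeline() {
        dai::Pipeline pipeline;
        // 0 disables XLink chunking, so each frame is sent as one packet.
        pipeline.setXLinkChunkSize(0);

        auto colorCam = pipeline.create<dai::node::ColorCamera>();
        auto xlinkOut = pipeline.create<dai::node::XLinkOut>();
        xlinkOut->setStreamName("video");

        colorCam->setResolution(dai::ColorCameraProperties::SensorResolution::THE_1080_P);
        colorCam->setFps(30);
        colorCam->video.link(xlinkOut->input);
        return pipeline;
    }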

iripatx commented 2 years ago

I understand. I'll try both options and post the results.

iripatx commented 2 years ago

Since the minimum resolution of the "video" output set by colorCam->setResolution() is 1080p, I linked the "preview" output instead and changed the resolution there using colorCam->setPreviewSize():

    colorCam->setPreviewSize(640, 400);
    colorCam->setResolution(dai::ColorCameraProperties::SensorResolution::THE_1080_P);
    colorCam->setInterleaved(false);
    colorCam->setFps(40);

    // Link plugins CAM -> XLINK
    colorCam->preview.link(xlinkOut->input);
    return pipeline;

At 1280x720 I still get ~8 FPS; only when I go down to 640x400 do I start getting ~37 FPS. Also, pipeline->setXLinkChunkSize(0) didn't make any noticeable difference.

I was wondering, should I encode the image (maybe using VideoEncoder)? We may need a better resolution than what we're getting.

saching13 commented 2 years ago

Can you try the following changes? They should increase the FPS at 720p.

    colorCam->setResolution(dai::ColorCameraProperties::SensorResolution::THE_1080_P);
    colorCam->setInterleaved(false);
    colorCam->setFps(40);
    // Scale the 1080p ISP output by 2/3, i.e. down to 1280x720.
    colorCam->setIspScale(2, 3);
    // Link the downscaled ISP output instead of the full-resolution video output.
    colorCam->isp.link(xlinkOut->input);

iripatx commented 2 years ago

Only a slight improvement at 9 FPS.

saching13 commented 2 years ago

I tested the same and saw that it was good initially but then dropped to 9 FPS. Sorry for the issue, and thanks for testing.

It looks like, in addition to the limited bandwidth, there is an issue with the PoE communication for the RGB node. We are working on it and will update it in the next release of depthai-core.

iripatx commented 2 years ago

Great! Thank you for your help.

dpm-seasony commented 2 years ago

Hi, do you have a timeline for the new release? We are experiencing the same issue on Galactic with an OAK-1-PoE.

Could the issue be related to the QoS profile? I see that the image topic is set to reliable, while the ROS 2 guidelines for sensor data recommend best effort.
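
For reference, a minimal ROS 2 sketch (node and topic names here are made up) of publishing with the sensor-data QoS profile, which defaults to best-effort reliability:

    #include <rclcpp/rclcpp.hpp>
    #include <sensor_msgs/msg/image.hpp>

    int main(int argc, char** argv) {
        rclcpp::init(argc, argv);
        auto node = rclcpp::Node::make_shared("oak_rgb_node");
        // SensorDataQoS uses best-effort reliability, as the ROS 2
        // guidelines recommend for high-rate sensor streams.
        auto pub = node->create_publisher<sensor_msgs::msg::Image>(
            "color/image", rclcpp::SensorDataQoS());
        rclcpp::spin(node);
        rclcpp::shutdown();
        return 0;
    }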

saching13 commented 2 years ago

I can make the default best effort in the next release, and I can update the devel branch of depthai-ros for this right away. Does that sound good?

saching13 commented 2 years ago

cc: @themarpe on the POE updates in depthai-core

dpm-seasony commented 2 years ago

> I can make the default best effort in the next release, and I can update the devel branch of depthai-ros for this right away. Does that sound good?

That sounds good. I can test whether it makes any difference tomorrow morning.

ctieben commented 2 years ago

Hello, I have the same issue; I tried these steps out with the same result.

I have a different setup with a notebook plus a PoE switch.

Platform: ThinkPad X1 Gen3
ROS version: Noetic on Ubuntu 20.04
Camera model: OAK-1-POE

It works well at 30 FPS in the SDK demo GUI, but with ROS or the basic API I have the same issue.

(even in this example: https://docs.luxonis.com/projects/api/en/latest/samples/ColorCamera/rgb_preview/#rgb-preview if I change from preview to video)

Edit: I can also run multiple OAK-1-POE cameras and they each run at ~8 FPS, so I don't think this is a limitation of the PoE network bandwidth itself.

ctieben commented 2 years ago

@saching13 Do you have any idea or hope how to fix this issue?

saching13 commented 2 years ago

Can you update to the new version of depthai-core? There might be some improvements.

> It works well at 30 FPS in the SDK demo GUI, but with ROS or the basic API I have the same issue.

I think in the Python GUI you are seeing a cropped, lower-resolution stream, which is what makes it keep up even at 4K.

ctieben commented 2 years ago

Thanks for the update! With the new version of depthai-core this issue seems to be fixed, at 1080p at least; 4K / 12MP / 13MP are still at about 8-9 FPS, but that could be a bandwidth issue.

dpm-seasony commented 2 years ago

Hello, I've updated to the new version of depthai-core with no luck; I still see ~8 FPS even at 1080p.

RoboEvangelist commented 2 years ago

Folks, per the online documentation we should get 30 FPS at 4K resolution: https://store.opencv.ai/products/oak-d

I have a USB OAK-D. In the Python demo I installed with pip I can get 30 FPS at 4K.

RoboEvangelist commented 2 years ago

Here is sample code (attached as a screenshot, "resolution").

saching13 commented 2 years ago

> Hello, I've updated to the new version of depthai-core with no luck; I still see ~8 FPS even at 1080p.

That means your PoE or Ethernet connection might be throttling.

saching13 commented 2 years ago

@RoboEvangelist your sample code is reading a preview whose resolution is 500x500.

samialperen commented 2 years ago

@saching13 is there any update on this? I am using the OAK-D-PRO USB version with stereo_inertial_publisher.launch, and I cannot see any parameters in the launch file to increase the color FPS.

Luxonis-Brandon commented 2 years ago

Sorry Sachin is offline currently but pinging @daxoft in case he can help.

samialperen commented 2 years ago

Guys, is there any update on this? The best I could get is still around 26 FPS at 720x1280 resolution (rostopic hz /stereo_inertial_publisher/color/image). I modified stereo_inertial_publisher.cpp a bit and added camRgb->setFps(40);, but still no luck.

Luxonis-Brandon commented 2 years ago

Sachin is coming back online. Sorry about the delay. Not sure why @daxoft didn't reply, asking offline.

@saching13 - thoughts here?

Oh, and what does your CPU usage look like when doing this, @samialperen? Uncompressed depth can consume a lot of bandwidth...

saching13 commented 2 years ago

    720x1280 * 3 Bpp = 2764800 * 18 FPS = 49766400 (RGB color stream)
    720x1280 * 2 Bpp = 1843200 * 30 FPS = 55296000 (depth stream)
    416x416  * 3 Bpp = 519168  * 16 FPS = 8306688  (preview stream for object detection)

The total of the above is 113369088 bytes/s -> ~113.37 MB/s (roughly 907 Mbps). So I think that is expected. If you need only RGB at a higher FPS, then I think we can get 40 FPS on RGB.


Thoughts?

samialperen commented 2 years ago

@saching13 @Luxonis-Brandon thanks a lot for the info, guys. Is there any chance I can get color, left, right, and stereo/depth (or depthCompressed with PNG level 1) all together at 30 FPS or above? I have tested my code on an OAK-D-Pro (USB) and could not achieve this, but I have not tested it on an OAK-Pro-W (PoE).

saching13 commented 2 years ago

I don't think that would be possible at the current resolution. Maybe if we scale to a lower resolution we might be able to get closer, but it is still hard to get all of them at 30 FPS over PoE, I think. @alex-luxonis can correct me.

samialperen commented 2 years ago

@saching13 thank you very much! If it is not too much work for you guys, would it be possible to have a table showing the max FPS we can get for each resolution once everything is enabled? Something similar to the one on this page: https://docs.luxonis.com/projects/api/en/latest/components/nodes/stereo_depth/#stereo-depth-fps

I think it would be really useful for the community. Thank you!

saching13 commented 2 years ago

Will do. Thanks and sorry for the issue. cc: @Erol444 on documentation.

samialperen commented 2 years ago

@saching13 No worries at all, man. Thanks a lot for the help! I will try to contribute a launch file to the depthai-ros-examples repository with parameters to publish all topics if required. At the moment, it is missing a launch file that publishes color, left, right, and depth at the same time.

saching13 commented 2 years ago

Yes, that was done to avoid an even bigger drop in bandwidth. Feel free to add it. I just moved depthai-ros-examples into the depthai-ros repo; please make the contributions there, as I will be deprecating the other one.

samialperen commented 2 years ago

Sure. Thanks for the heads up!

ctieben commented 2 years ago

Will it be possible to encode the video stream on the device (https://docs.luxonis.com/projects/api/en/latest/samples/VideoEncoder/rgb_encoding/) and then publish it as FFmpeg video stream messages in ROS (see: https://github.com/daniilidis-group/ffmpeg_image_transport) to avoid the bandwidth limitations of the PoE devices?

samialperen commented 2 years ago

@ctieben hmm, interesting idea! What do you think, @saching13? Would it be technically possible? If it is, I can try to put some work into it.

saching13 commented 2 years ago

I think that could work, depending on the use case and the resources available, since it will need additional memory allocation on the device and additional processing on the host to decode on the subscriber side.

ctieben commented 2 years ago

Hi there, I just tried the solution of encoding the video stream to reduce the bandwidth of the PoE cameras, and it works pretty well.

With this I could stream 4K @ 28 FPS as (M)JPEG, which allows using the standard compressed images of ROS. In this scenario I see an average bandwidth of 65 MB/s for the data stream in ROS.

With H.26x it goes down to an average of 2.5 MB/s.
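
For a rough sense of the savings (the 65 MB/s and 2.5 MB/s above are measured; the raw figure assumes uncompressed 4K BGR at 3 bytes per pixel):

    3840x2160 * 3 Bpp = 24883200 bytes/frame
    24883200 * 28 FPS ≈ 697 MB/s raw, vs. ~65 MB/s MJPEG, vs. ~2.5 MB/s H.26x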

samialperen commented 2 years ago

@ctieben, this is great! I did not have enough time to work on it. I wonder whether you could share how you did it.

saching13 commented 2 years ago

Thanks for trying it out, @ctieben. Let me know if either of you would be up for making a PR for this; otherwise I will add it to the feature list.

ctieben commented 2 years ago

Hi, sorry I forgot to send the code snippet in the first place. Here is the code I used to try it out:

#include <depthai/depthai.hpp>
#include <ros/ros.h>
#include <sensor_msgs/CompressedImage.h>

int main(int argc, char** argv) {
    ros::init(argc, argv, "rgb_mjpeg_publisher");
    ros::NodeHandle pnh("~");

    // Create pipeline
    dai::Pipeline pipeline;

    // Define sources and outputs
    auto camRgb = pipeline.create<dai::node::ColorCamera>();
    auto videoEnc = pipeline.create<dai::node::VideoEncoder>();
    auto xout = pipeline.create<dai::node::XLinkOut>();

    xout->setStreamName("mjpeg");

    // Properties
    camRgb->setBoardSocket(dai::CameraBoardSocket::RGB);
    camRgb->setResolution(dai::ColorCameraProperties::SensorResolution::THE_4_K);
    videoEnc->setDefaultProfilePreset(30, dai::VideoEncoderProperties::Profile::MJPEG);

    // Linking
    camRgb->video.link(videoEnc->input);
    videoEnc->bitstream.link(xout->input);

    // Connect to device and start pipeline
    dai::Device device(pipeline);

    // Output queue will be used to get the encoded data from the output defined above
    auto q = device.getOutputQueue("mjpeg", 30, true);

    auto pubImgCompr = pnh.advertise<sensor_msgs::CompressedImage>("color/compressed", 30);

    int seq = 0;
    while(pnh.ok()) {
        auto jpegPacket = q->get<dai::ImgFrame>();

        sensor_msgs::CompressedImage outImageMsg;

        outImageMsg.header.stamp = ros::Time::now(); // or: getFrameTime(ros::Time::now(), std::chrono::steady_clock::now(), jpegPacket->getTimestamp());
        outImageMsg.header.seq = seq++;
        outImageMsg.header.frame_id = "cam";
        outImageMsg.format = "jpeg";

        // The encoder output is already a JPEG bitstream, so it can be
        // copied straight into the compressed image message.
        outImageMsg.data = jpegPacket->getData();

        pubImgCompr.publish(outImageMsg);

        ros::spinOnce();
    }

    return 0;
}

It will take some time for me to integrate this into the bridge, but I will try within the next few weeks and could create a PR for it.
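
As a usage note, a topic published this way can be decoded on the host with OpenCV; a minimal (hypothetical) ROS 1 subscriber might look like this:

    #include <opencv2/imgcodecs.hpp>
    #include <ros/ros.h>
    #include <sensor_msgs/CompressedImage.h>

    // Decode each JPEG-compressed frame back to a BGR cv::Mat on the host.
    void onJpeg(const sensor_msgs::CompressedImageConstPtr& msg) {
        cv::Mat frame = cv::imdecode(msg->data, cv::IMREAD_COLOR);
        // ... display, record, or process `frame` here ...
    }

    int main(int argc, char** argv) {
        ros::init(argc, argv, "mjpeg_listener");
        ros::NodeHandle nh;
        ros::Subscriber sub = nh.subscribe("color/compressed", 10, onJpeg);
        ros::spin();
        return 0;
    }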

samialperen commented 2 years ago

@saching13 @ctieben I am trying to do this on mono camera to increase mono and depth FPS. Here is the code I came up with: https://pastebin.com/LiKq6PEn

I am getting these errors:

[184430104159671200] [192.168.123.21] [96.367] [StereoDepth(2)] [error] Left and right input images must have the same width! Skipping frame!
[184430104159671200] [192.168.123.21] [96.367] [StereoDepth(2)] [error] Maximum supported input image width for stereo is 1280. Skipping frame!
[184430104159671200] [192.168.123.21] [96.367] [StereoDepth(2)] [error] Left input image stride ('0') should be equal to its width ('120622'). Skipping frame!
[184430104159671200] [192.168.123.21] [96.400] [StereoDepth(2)] [error] Left and right input images must have the same width! Skipping frame!
[184430104159671200] [192.168.123.21] [96.400] [StereoDepth(2)] [error] Maximum supported input image width for stereo is 1280. Skipping frame!
[184430104159671200] [192.168.123.21] [96.400] [StereoDepth(2)] [error] Left input image stride ('0') should be equal to its width ('113695'). Skipping frame!

I believe the problem is linking the video encoder output (bitstream) to the stereo input, but I could not find any documentation on how to do this properly.

ctieben commented 2 years ago

Hi @samialperen, I'm not an expert, but it looks like you are trying to link the encoded data stream into the stereo matching, which will not work.

You should only encode the data stream right before it is linked to the output (xout).

samialperen commented 2 years ago

Hi @ctieben, thanks a lot. I got rid of the stereo matching part (which eliminated the error), but I do not know how to just encode monoLeft and monoRight and use the dai bridge. I am roughly following this tutorial: https://docs.luxonis.com/projects/api/en/latest/samples/VideoEncoder/rgb_mono_encoding/#rgb-mono-encoding without the color stream, but I do not get anything published in ROS after adding the necessary pieces.

saching13 commented 2 years ago

For example: instead of monoLeft->out.link(ve1->input), do stereo->syncedLeft.link(ve1->input), or stereo->rectifiedLeft.link(ve1->input) if you need the rectified streams. But keep it optional.

And you might also need to add another ROS converter (an ImageConverter instance) for the new stream.
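
A minimal sketch of that wiring (assuming the depthai-core C++ API; the veLeft/xoutLeft names are made up), keeping raw frames on the stereo inputs and encoding only the XLink tap:

    // Inside the pipeline-building function:
    dai::Pipeline pipeline;
    auto monoLeft  = pipeline.create<dai::node::MonoCamera>();
    auto monoRight = pipeline.create<dai::node::MonoCamera>();
    auto stereo    = pipeline.create<dai::node::StereoDepth>();
    auto veLeft    = pipeline.create<dai::node::VideoEncoder>();
    auto xoutLeft  = pipeline.create<dai::node::XLinkOut>();

    monoLeft->setBoardSocket(dai::CameraBoardSocket::LEFT);
    monoRight->setBoardSocket(dai::CameraBoardSocket::RIGHT);
    veLeft->setDefaultProfilePreset(30, dai::VideoEncoderProperties::Profile::MJPEG);
    xoutLeft->setStreamName("left_mjpeg");

    // StereoDepth must receive uncompressed frames, so the encoder taps
    // the synced left output rather than sitting between camera and stereo.
    monoLeft->out.link(stereo->left);
    monoRight->out.link(stereo->right);
    stereo->syncedLeft.link(veLeft->input);
    veLeft->bitstream.link(xoutLeft->input);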

ctieben commented 1 year ago

Hi there, it's great to see that the low-bandwidth mode (with MJPEG compression) was integrated into the driver. This should solve this issue!

One last question: will it also be possible to transport the compressed images as-is, so the host system is not stressed by converting them back to raw when that is not needed (e.g. for recordings)?

PS: It's also great to see dynamic parameters for ROS 1 now 👍

Serafadam commented 12 months ago

Closing due to inactivity; please reopen if there are still questions. Regarding the comment above (publishing compressed images without converting them to raw): it has been added to the roadmap.