prominenceai / deepstream-services-library

A shared library of on-demand DeepStream Pipeline Services for Python and C/C++
MIT License

Handling Buffer Accumulation in DeepStream Pipeline with Slow Inference Speed (app-source) #1140

Open YoungjaeDev opened 8 months ago

YoungjaeDev commented 8 months ago

I previously shared a setup where I read an image buffer from a Redis server, converted it into a GStreamer buffer, and then created a DeepStream pipeline through an app-source. The buffer is ultimately pushed using this C++ code:

DslReturnType retval = dsl_source_app_buffer_push(L"app-source", buffer);

Now, in this setup, where I'm using a player (interpipe-source) for DeepStream inference, I'm concerned about the scenario where inference runs slower than the image push rate. In that case, would image buffers continuously accumulate in some queue? If so, is there an existing mechanism for dropping the accumulating buffers to prevent overload? I'm currently using a fake-sink.

Thank you
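The concern above can be sketched as a toy model (plain C++, not DSL or GStreamer code; the function name, rates, and per-second loop are illustrative assumptions): when buffers are pushed faster than inference consumes them, an unbounded queue keeps growing.

```cpp
#include <cassert>
#include <cstddef>
#include <deque>

// Toy model of producer/consumer rate mismatch: buffers pushed at
// push_fps, consumed by inference at infer_fps, for a given duration.
// Returns the number of buffers left waiting in the queue.
inline std::size_t simulate_backlog(int push_fps, int infer_fps, int seconds) {
    std::deque<int> queue;
    for (int s = 0; s < seconds; ++s) {
        for (int i = 0; i < push_fps; ++i)
            queue.push_back(i);                  // app-source pushes a buffer
        for (int i = 0; i < infer_fps && !queue.empty(); ++i)
            queue.pop_front();                   // inference consumes one
    }
    return queue.size();                         // accumulated backlog
}
```

For example, pushing at 30 fps while inferring at only 10 fps leaves a backlog of (30 - 10) x 10 = 200 buffers after 10 seconds, which keeps growing without some dropping policy.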

rjhowell44 commented 8 months ago

@youngjae-avikus I plan to add two new component services to allow you to set the max-size of the input queue to any component... and control the leaky property. See https://gstreamer.freedesktop.org/documentation/coreelements/queue.html#queue:leaky. With this you can set the queue to drop (leak) either the oldest or newest buffers when max queue size is reached.

Let me know if this meets your needs.
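The leaky semantics referenced above can be sketched in plain C++ (a toy model only, not the DSL or GStreamer API): with leaky=upstream the new incoming buffer is dropped when the queue is full, while with leaky=downstream the oldest queued buffer is dropped to make room.

```cpp
#include <cassert>
#include <cstddef>
#include <deque>

// Illustration of GStreamer's queue "leaky" behaviour (toy model).
enum class Leaky { None, Upstream, Downstream };

class BoundedQueue {
public:
    BoundedQueue(std::size_t max, Leaky mode) : max_(max), mode_(mode) {}

    // Returns true if the buffer was queued, false if it was dropped
    // (or would have blocked, in the non-leaky case).
    bool push(int buf) {
        if (q_.size() < max_) { q_.push_back(buf); return true; }
        switch (mode_) {
        case Leaky::Upstream:           // drop the incoming (newest) buffer
            return false;
        case Leaky::Downstream:         // drop the oldest queued buffer
            q_.pop_front();
            q_.push_back(buf);
            return true;
        default:                        // no leak: a real queue would block
            return false;
        }
    }

    int front() const { return q_.front(); }
    std::size_t size() const { return q_.size(); }

private:
    std::size_t max_;
    Leaky mode_;
    std::deque<int> q_;
};
```

With leaky=downstream and a small max size, the queue always holds the most recent buffers, which is the behaviour wanted when inference can't keep up.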

YoungjaeDev commented 8 months ago

> @youngjae-avikus I plan to add two new component services to allow you to set the max-size of the input queue to any component... and control the leaky property. See https://gstreamer.freedesktop.org/documentation/coreelements/queue.html#queue:leaky. With this you can set the queue to drop (leak) either the oldest or newest buffers when max queue size is reached.
>
> Let me know if this meets your needs.

It does seem that buffers are accumulating in the queue. In my case, I would set the max queue size to 1, since my goal is to always run inference on the latest buffer. It would also be nice to have an alarm when the system starts falling behind.
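The "latest buffer only, plus an alarm" idea can be sketched as a single-slot holder (a toy C++ model; the class and callback names are hypothetical, not DSL API). In a real pipeline, GStreamer's queue element also emits an "overrun" signal when it becomes full, which could serve as the alarm hook.

```cpp
#include <cassert>
#include <functional>
#include <optional>
#include <utility>

// Toy model of a max-size-1 leaky queue: the producer always overwrites
// any pending buffer with the newest one, and an "overrun" callback fires
// each time a buffer is dropped because the consumer is falling behind.
class LatestOnlySlot {
public:
    explicit LatestOnlySlot(std::function<void(int dropped)> on_overrun)
        : on_overrun_(std::move(on_overrun)) {}

    // Producer side: keep only the newest buffer.
    void push(int buf) {
        if (slot_) {                      // consumer hasn't kept up
            ++dropped_;
            if (on_overrun_) on_overrun_(dropped_);
        }
        slot_ = buf;
    }

    // Consumer side: take the latest buffer, if any.
    std::optional<int> pop() {
        std::optional<int> out = slot_;
        slot_.reset();
        return out;
    }

    int dropped() const { return dropped_; }

private:
    std::optional<int> slot_;
    int dropped_ = 0;
    std::function<void(int)> on_overrun_;
};
```

The drop counter passed to the callback gives a simple measure of how far behind inference is running.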

rjhowell44 commented 4 months ago

This work is now covered by issue #1225