Set of Python bindings to C++ libraries which provides full HW acceleration for video decoding, encoding and GPU-accelerated color space and pixel format conversions
Apache License 2.0
1.32k stars · 233 forks
How to limit GPU usage when decoding multi-stream #436
Hi, thanks for the nice repo.

Here is my question: I want to decode as many RTSP streams as possible on a single GPU. Is there a way to set a threshold for the GPU memory usage per process?

I have tried SampleDecodeRTSP.py and found that the memory consumption is not constant: it starts at 189 MiB and then goes up and down (ranging from 189 to 411 MiB as far as I observed).
Since we don't know how many frames are available in an RTSP live stream, the sample code never flushes the decoder. I have no idea whether this is the main reason the memory consumption grows. The only flushing function I could find is FlushSingleSurface, but it doesn't seem to be the one I'm looking for.

Any advice would be greatly appreciated.
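For context, my understanding of FlushSingleSurface (from the file-based samples, not from SampleDecodeRTSP.py) is that it only drains the frames the decoder has buffered once the input ends, which is why I don't see how it applies to a live stream. A rough sketch of that usage, where "input.mp4" and gpu_id = 0 are just placeholders and the method names reflect my reading of the PyNvCodec samples:

```python
# Sketch of how FlushSingleSurface seems intended to be used with a bounded
# input (a file), to contrast with the live RTSP case. "input.mp4" and
# gpu_id = 0 are placeholders.
import PyNvCodec as nvc

gpu_id = 0
nv_dec = nvc.PyNvDecoder("input.mp4", gpu_id)

# Regular decode loop: runs until the demuxer reports end of stream.
while True:
    surf = nv_dec.DecodeSingleSurface()
    if surf.Empty():
        break
    # ... hand surf to downstream processing ...

# Drain whatever the decoder still holds internally. A live RTSP stream has
# no end of input, so this stage would never be reached there.
while True:
    surf = nv_dec.FlushSingleSurface()
    if surf.Empty():
        break
    # ... hand surf to downstream processing ...
```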