sagi-scalers-ai closed this issue 1 year ago
@sagi-scalers-ai We have published a Video Streamer workflow on Intel CPU. Is this what you want to do on the Flex 140 GPU?
@huiyan2021 Can you point me to the Video Streamer workflow? I'm currently working with PyAV to perform GPU-accelerated encoding and decoding using the "h264_qsv" codec. However, during GPU inference with ITEX there seems to be a conflict: the GPU is occupied by inference, preventing PyAV from executing on the GPU.
@sagi-scalers-ai To run decode and encode alongside inference on the Flex 140 GPU using ITEX, we have a solution. The whole pipeline is based on GStreamer: the decode and encode plugins run on the GPU, and inference is implemented as a GStreamer plugin as well, as @huiyan2021 pointed out for video-streamer. A `vacompositor` element combines multiple video streams into one batch for inference, which is how multiple streams are supported. We are working to upstream the code into the video-streamer repo.
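To make the proposed layout concrete, here is a minimal sketch of how such a multi-stream pipeline description could be assembled. This is an assumption-laden illustration, not the video-streamer implementation: `vah264dec`, `vah264enc`, and `vacompositor` are the VA-based GStreamer elements, while the `inference` element name is a hypothetical placeholder for whatever plugin the workflow provides.

```python
def build_pipeline(uris):
    """Assemble a gst-launch-1.0-style pipeline string that decodes
    several H.264 files on the GPU and batches them via vacompositor.

    NOTE: 'inference' is a placeholder element name, not a real plugin;
    substitute the inference plugin from the video-streamer workflow.
    """
    # One decode branch per input stream, each feeding a compositor pad.
    branches = [
        f"filesrc location={uri} ! h264parse ! vah264dec ! comp.sink_{i}"
        for i, uri in enumerate(uris)
    ]
    # Batched output goes through inference, then GPU encode.
    return (
        "vacompositor name=comp ! inference ! vah264enc ! h264parse ! fakesink "
        + " ".join(branches)
    )

print(build_pipeline(["cam0.mp4", "cam1.mp4"]))
```

The returned string is only a template for experimentation with `gst-launch-1.0`; in a real deployment the pipeline would be constructed programmatically with the GStreamer bindings.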
@xiguiw Thank you for your response.
I'm seeking guidance on incorporating decode and encode alongside inference on Flex 140 GPUs using ITEX, as well as on running parallel inference streams on the same GPU. Any assistance on these aspects would be greatly appreciated.