
Intel® Extension for TensorFlow*

How to run decode and encode alongside inference on Flex 140 GPU using ITEX #45

Closed: sagi-scalers-ai closed this issue 1 year ago

sagi-scalers-ai commented 1 year ago

I'm seeking guidance on running decode and encode alongside inference on Flex 140 GPUs using ITEX, as well as on implementing parallel inference streams on the same GPU. Any assistance on these aspects would be greatly appreciated.

huiyan2021 commented 1 year ago

@sagi-scalers-ai We have published a Video Streamer workflow on Intel CPU. Is this what you want to do on the Flex 140 GPU?

sagi-scalers-ai commented 1 year ago

@huiyan2021 Can you point me to the Video Streamer workflow? I'm currently working with PyAV to perform GPU-accelerated encoding and decoding using the "h264_qsv" codec. However, during GPU inference with ITEX there seems to be a conflict: the GPU is occupied by inference, preventing PyAV from executing on the GPU.
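
For reference, the PyAV side of this might look roughly like the sketch below. It is illustrative only: the output path, resolution, and placeholder frames are made up, and "h264_qsv" is only available if the FFmpeg underneath PyAV was built with Quick Sync Video support; the decode path would analogously use av.CodecContext.create("h264_qsv", "r").

```python
import av
import numpy as np

# Open an output container and request the Quick Sync H.264 encoder.
# "h264_qsv" only exists if the underlying FFmpeg was built with QSV.
output = av.open("out.mp4", "w")        # hypothetical output path
stream = output.add_stream("h264_qsv", rate=30)
stream.width, stream.height = 1920, 1080
stream.pix_fmt = "nv12"                 # QSV encoders typically take NV12

for _ in range(300):                    # placeholder frames standing in for real video
    img = np.zeros((1080, 1920, 3), dtype=np.uint8)
    frame = av.VideoFrame.from_ndarray(img, format="bgr24").reformat(format="nv12")
    for packet in stream.encode(frame):
        output.mux(packet)

for packet in stream.encode():          # flush buffered packets from the encoder
    output.mux(packet)
output.close()
```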

huiyan2021 commented 1 year ago

https://github.com/intel/video-streamer/blob/main/DEVCATALOG.md

xiguiw commented 1 year ago

@sagi-scalers-ai To run decode and encode alongside inference on a Flex 140 GPU using ITEX, we have a solution. The whole pipeline is based on GStreamer. Decode/encode run on the GPU through existing GStreamer plugins, and inference is implemented as a GStreamer plugin, following the video-streamer workflow @huiyan2021 pointed to:

  1. To decode/encode on the GPU, you need to build GStreamer from source with VA-API enabled; no binaries are released (due to licensing problems).
  2. To implement a GStreamer ITEX inference plugin, you can refer to video-streamer.
  3. You also need vacompositor to combine multiple video streams into one batch for inference, so that multiple streams are supported; see the pipeline sketch after this list.
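
A rough sketch of what such a pipeline could look like, driven from Python, is below. It is illustrative only: "itex_infer" is a placeholder for the custom ITEX inference element you would implement following video-streamer, the VA element names (vah264dec, vacompositor, vah264enc) assume a sufficiently recent GStreamer built with VA support as in step 1, and all file names are made up.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Two H.264 files are decoded on the GPU, batched by vacompositor, passed
# through a custom ITEX inference element ("itex_infer" is a placeholder),
# and re-encoded on the GPU.
pipeline = Gst.parse_launch(
    "filesrc location=cam0.mp4 ! qtdemux ! h264parse ! vah264dec ! comp.sink_0 "
    "filesrc location=cam1.mp4 ! qtdemux ! h264parse ! vah264dec ! comp.sink_1 "
    "vacompositor name=comp ! videoconvert ! "
    "itex_infer model=frozen_graph.pb ! "
    "videoconvert ! vah264enc ! h264parse ! mp4mux ! "
    "filesink location=out.mp4"
)

pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```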

We are trying to upstream the code into the video-streamer repo.

sagi-scalers-ai commented 1 year ago

@xiguiw Thank you for your response.