You can check out how it's done in this project -- it's slightly complex, but you can follow the sample app and step through the code. You will notice that you need to create a GPU device context, send the packet data to the GPU decoder, and then transfer the decoded frame from the GPU back into RAM. After that, you will need to convert the frame from the GPU-specific colorspace to a colorspace that works for you.
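Roughly, those steps map onto FFmpeg's C API as in the sketch below. This is a minimal illustration under my own assumptions, not this project's actual code: the helper names `open_gpu_decoder` and `decode_packet_on_gpu` are made up for the example, error handling and the packet-reading loop are omitted, and FFmpeg.AutoGen exposes these same functions one-to-one from C#.

```c
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

// Pick the DXVA2 GPU surface format when the decoder offers it.
static enum AVPixelFormat get_hw_format(AVCodecContext *ctx,
                                        const enum AVPixelFormat *pix_fmts)
{
    for (const enum AVPixelFormat *p = pix_fmts; *p != AV_PIX_FMT_NONE; p++)
        if (*p == AV_PIX_FMT_DXVA2_VLD)
            return *p;
    return AV_PIX_FMT_NONE; // no GPU format offered; decoding cannot proceed
}

// Step 1: create the GPU device context and attach it before opening.
static void open_gpu_decoder(AVCodecContext *decoder_ctx, const AVCodec *decoder)
{
    AVBufferRef *hw_device_ctx = NULL;
    av_hwdevice_ctx_create(&hw_device_ctx, AV_HWDEVICE_TYPE_DXVA2, NULL, NULL, 0);
    decoder_ctx->hw_device_ctx = av_buffer_ref(hw_device_ctx);
    decoder_ctx->get_format = get_hw_format;
    avcodec_open2(decoder_ctx, decoder, NULL);
}

// Steps 2-3: decode a packet on the GPU, then copy each frame to RAM.
static void decode_packet_on_gpu(AVCodecContext *decoder_ctx, const AVPacket *pkt)
{
    AVFrame *hw_frame = av_frame_alloc();
    AVFrame *sw_frame = av_frame_alloc();

    avcodec_send_packet(decoder_ctx, pkt); // frames are decoded into GPU memory

    while (avcodec_receive_frame(decoder_ctx, hw_frame) == 0) {
        // Transfer the decoded frame from GPU memory into system RAM.
        // The result is typically NV12, which still needs a colorspace
        // conversion (e.g. via sws_scale) before it is usable as RGB.
        av_hwframe_transfer_data(sw_frame, hw_frame, 0);
        // ... consume sw_frame here ...
        av_frame_unref(sw_frame);
    }
    av_frame_free(&hw_frame);
    av_frame_free(&sw_frame);
}
```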
Search my code for the following function. The key is to examine the ExchangeFrame call to the HardwareAccelerator object.
```csharp
protected override MediaFrame CreateFrameSource(IntPtr framePointer)
```
Now, what I described is one way of doing it -- there is also a simpler way (if the codec is available directly). For example:
```c
// C/C++ example: use the DXVA2 decoder directly
#include <libavcodec/avcodec.h>

AVCodec *decoder = avcodec_find_decoder_by_name("h264_dxva2");
AVCodecContext *decoder_ctx = avcodec_alloc_context3(decoder);
avcodec_open2(decoder_ctx, decoder, NULL);
```
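One caveat worth adding (my note, not part of the original answer): `avcodec_find_decoder_by_name()` returns `NULL` when the named decoder is not compiled into your FFmpeg build, so it pays to guard the lookup and fall back to the plain software decoder:

```c
// Guard against FFmpeg builds that lack the hardware-specific decoder;
// fall back to the software H.264 decoder in that case.
AVCodec *decoder = avcodec_find_decoder_by_name("h264_dxva2");
if (!decoder)
    decoder = avcodec_find_decoder(AV_CODEC_ID_H264);
```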
As for providing sample code for your specific purpose, I apologize, but I would consider that outside the scope of this project.
How to decode an MP4 file using the GPU?
I am using the FFmpeg.AutoGen demo. Its `private static unsafe void DecodeAllFramesToImages()` method works well, but it decodes on the CPU. I want a demo that decodes on the GPU. How can I do this?
This is the FFmpeg.AutoGen demo: