DJI-Mobile-SDK-Tutorials / Android-VideoStreamDecodingSample

This sample project demonstrates how to use FFmpeg for video frame parsing and MediaCodec for hardware decoding on DJI products.
MIT License

M300 video streaming question #87

Open neilyoung opened 1 year ago

neilyoung commented 1 year ago

Hi,

What I'm currently doing is obtaining the H.264 stream from the DJI SDK via the "onReceive" callback and feeding it straight back into the SDK using "sendDataToDecoder". The resulting YUV frames delivered via "onYuvDataReceived" are then used for further processing.
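
For reference, this is roughly what that pipeline looks like on my side. The class name is just for illustration, and the exact callback signatures (e.g. whether onYuvDataReceived includes a MediaFormat argument) may differ between SDK 4.x releases, so treat this as a sketch rather than exact code:

```java
import dji.sdk.camera.VideoFeeder;
import dji.sdk.codec.DJICodecManager;

// Hypothetical wrapper class, only to make the flow self-contained.
public class YuvPipeline {

    // Assumed to be created elsewhere with a Context/Surface, as in the sample app.
    private DJICodecManager codecManager;

    void start() {
        // 1) Receive raw H.264 from the SDK ...
        VideoFeeder.getInstance().getPrimaryVideoFeed().addVideoDataListener(
                (videoBuffer, size) -> {
                    // 2) ... and feed it straight back into the SDK decoder.
                    if (codecManager != null) {
                        codecManager.sendDataToDecoder(videoBuffer, size);
                    }
                });

        // 3) Receive the decoded YUV frames for further processing.
        codecManager.enabledYuvData(true);
        codecManager.setYuvDataCallback((format, yuvFrame, dataSize, width, height) -> {
            byte[] bytes = new byte[dataSize];
            yuvFrame.get(bytes);
            // ... hand the YUV frame to downstream processing here
        });
    }
}
```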

Now I have found this post: https://sdk-forum.dji.net/hc/en-us/articles/4404231981465-How-to-get-the-stanard-H-264-video-stream-from-M300

This post suggests using "provideTranscodedVideoFeed" in order to get "real" H.264 (containing SPS and PPS) derived from the initial H.264 stream.
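
If it helps, attaching to that feed should look something like the sketch below. The class name is hypothetical, and I have not checked in which SDK 4.x release provideTranscodedVideoFeed() first appeared, so this is only an illustration of the idea from the linked post:

```java
import dji.sdk.camera.VideoFeeder;
import dji.sdk.codec.DJICodecManager;

// Hypothetical helper class, only for illustration.
public class TranscodedFeedSketch {

    void attach(DJICodecManager codecManager) {
        // Transcoded feed as described in the linked forum post.
        VideoFeeder.VideoFeed transcodedFeed =
                VideoFeeder.getInstance().provideTranscodedVideoFeed();

        transcodedFeed.addVideoDataListener((videoBuffer, size) -> {
            // This stream should already contain SPS/PPS ("standard" H.264).
            // Open question: whether it also has to be the decoder input to
            // get proper YUV on the M300, or whether the primary feed is
            // still fine for that.
            codecManager.sendDataToDecoder(videoBuffer, size);
        });
    }
}
```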

I'm not sure whether this should concern me, which is why I'm asking here: would I also have to use this "provideTranscodedVideoFeed" callback in order to get proper YUV frames on the M300?