Closed: zamirkhan closed this 5 years ago
The sample app code uses the RenderingVideo event to capture the bitmap, paint an audio-level chart on it, and display it. Is that not working for you? As far as the first frame goes, you can simply count how many times the RenderingVideo event has been called.
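For illustration, here is a minimal sketch of that counting approach. It assumes an FFME-style MediaElement named Media (as in the sample app) and the RenderingVideo / CaptureBitmapAsync members discussed in this thread; the handler takes the base EventArgs so it binds regardless of the exact event-args type, but member names and threading details may differ in your version.

```csharp
using System;
using System.Threading;
using System.Windows;

public partial class MainWindow : Window
{
    private int _renderedFrameCount;

    public MainWindow()
    {
        InitializeComponent();

        // "Media" is assumed to be the FFME MediaElement declared in XAML.
        Media.RenderingVideo += OnRenderingVideo;
    }

    private void OnRenderingVideo(object sender, EventArgs e)
    {
        // Interlocked keeps the count correct even if the event is raised off the UI thread.
        var frameNumber = Interlocked.Increment(ref _renderedFrameCount);
        if (frameNumber != 1)
            return;

        // The first frame has now actually been rendered, so a capture
        // should no longer come back null.
        Dispatcher.InvokeAsync(async () =>
        {
            var bitmap = await Media.CaptureBitmapAsync();
            if (bitmap != null)
                ProcessFirstFrame(bitmap); // your own processing
        });
    }

    private void ProcessFirstFrame(object bitmap)
    {
        // Placeholder for the bitmap processing described in the question below.
    }
}
```

Note that, as mentioned above, the sample app reads the frame bitmap directly from the RenderingVideo event arguments, so depending on the processing you need you may not need CaptureBitmapAsync at all.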
Closing, since no further feedback was provided.
After loading a video, I want to convert the first frame (and subsequent frames, if traversed) to a Bitmap and do some processing. I am able to capture subsequent frames by calling CaptureBitmapAsync after the PositionChanged event. However, I have so far been unable to find an event that fires right after the video has been loaded at which CaptureBitmapAsync does not return null. I've tried MediaReady, VideoFrameDecoded, and more. What am I missing? Thanks in advance.
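For context, here is a minimal sketch of the subsequent-frame capture described above, reusing the assumptions from the sketch earlier in this thread (a MediaElement named Media); ProcessFrame is a hypothetical placeholder for the processing step.

```csharp
// Assumed to live in the same WPF code-behind class as the earlier sketch,
// with the FFME MediaElement declared in XAML as "Media".
private async void OnPositionChanged(object sender, EventArgs e)
{
    // This works for frames reached by seeking or stepping, but before the
    // first frame has been rendered the capture still comes back null,
    // which is the problem described above.
    var bitmap = await Media.CaptureBitmapAsync();
    if (bitmap == null)
        return;

    ProcessFrame(bitmap); // hypothetical processing routine
}

// Subscribed once, e.g. in the constructor:
// Media.PositionChanged += OnPositionChanged;
```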