hi all,
I have a basic pipeline.
1- Create an mp4 video from RGB images with FFmpeg
2- Decode this mp4 video in a web browser (Chrome) with WebCodecs (using your sample code)
I realized that when I decode the frames, three frames are always missing: the last three.
This sounds like a Chrome bug, while this tracker is for the specification. If you're calling flush() and still not seeing those frames, feel free to open a bug at https://crbug.com/new
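For context on why flush() matters here: decoders hold back a few frames for reordering (e.g. B-frame delay), and those buffered frames are only emitted when the decoder is flushed. A minimal sketch of that behavior — `BufferingDecoder` is a made-up stand-in for illustration, not the WebCodecs API:

```javascript
// Illustrative stand-in: buffers frames like a real decoder's reorder queue.
// (The real browser API is VideoDecoder, whose flush() returns a promise
// you should await after the last decode() call.)
class BufferingDecoder {
  constructor(output, depth = 3) {
    this.output = output; // output callback, like VideoDecoder's `output` option
    this.queue = [];      // frames held back for reordering
    this.depth = depth;   // decode delay (e.g. B-frame lookahead)
  }
  decode(frame) {
    this.queue.push(frame);
    // Only emit a frame once the reorder window is full.
    if (this.queue.length > this.depth) this.output(this.queue.shift());
  }
  flush() {
    // Drain everything still buffered — without this, the tail is lost.
    while (this.queue.length) this.output(this.queue.shift());
  }
}

const frames = [];
const dec = new BufferingDecoder(f => frames.push(f));
for (let i = 1; i <= 10; i++) dec.decode(i);

const seenBeforeFlush = frames.length; // 7: the last three frames are still queued
dec.flush(); // real code: await decoder.flush()
// frames.length is now 10
```

With a decode delay of three, exactly the last three frames stay queued until flush() — the same symptom as in the report above.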
here is my encoding script for step 1:
You can decode with FFmpeg to check the frames (everything is OK; there are no missing frames). To decode:
ffmpeg -i output.mp4 thumb%04d.jpg
And for step 2, as I said, I use this sample ( https://github.com/w3c/webcodecs/tree/main/samples/mp4-decode ), but it's not possible to extract the last three frames.
Is there any way to solve this problem? Maybe a config option or something like that?