AndreFrelicot closed this issue 5 years ago.
NB: at the same time, I decode the H.264 packet with the Player for testing purposes, and the decoded image is correct (except that it is 8 px wider, which is strange!).
I could try to explain it in detail, but it's probably a better strategy to debug through the non-OpenGL image transformation. It is done in JavaScript and draws onto a canvas. You should be able to copy that code, or at least infer from it how you need to go about it.
The 8 px probably comes from the fact that Broadway always produces output whose resolution is divisible by 16. So if you know the resolution of your video, you can crop the dimensions of the result accordingly.

Hope this helped.
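A minimal sketch of that padding rule (not from the thread; `paddedSize` is a hypothetical helper name): H.264 encodes in 16×16 macroblocks, so each dimension is rounded up to the next multiple of 16, and the extra columns/rows should be cropped away after decoding.

```javascript
// Round a dimension up to the next multiple of 16 (the H.264 macroblock size).
// Broadway's decoded buffer uses these padded dimensions, not the display size.
function paddedSize(n) {
  return Math.ceil(n / 16) * 16;
}

// e.g. a 1000-px-wide video decodes into a 1008-px-wide buffer (8 px wider),
// while 1920 is already a multiple of 16 and stays unchanged.
const bufferWidth = paddedSize(1000);  // 1008
const displayWidth = 1000;             // crop back to this when drawing
```

When copying pixels out, treat the padded width as the row stride and only read the first `displayWidth` pixels of each row.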
Yes, I've already walked through renderFrameRGB; my code is inspired by it. I've also tested with putImageData, but the result is still scrambled. I'll review it again and find what I'm doing wrong.
Remember, the raw output is in YUV color, in the planar 4:2:0 format, so you have three planes of picture data. Read up on YUV 4:2:0 if you are confused; there are tons of docs about it.
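To make the plane layout concrete, here is a sketch of a YUV 4:2:0 to RGBA conversion (not the code from the thread or the linked gist; it assumes planar I420 layout, a full-size Y plane followed by quarter-size U and V planes, and BT.601 limited-range coefficients, which may differ from what your stream uses):

```javascript
// buf: Uint8Array holding Y (w*h bytes), then U and V (w/2 * h/2 bytes each).
// Returns an RGBA buffer suitable for wrapping in an ImageData.
function yuv420ToRGBA(buf, width, height) {
  const ySize = width * height;
  const uvSize = (width / 2) * (height / 2);
  const yPlane = buf.subarray(0, ySize);
  const uPlane = buf.subarray(ySize, ySize + uvSize);
  const vPlane = buf.subarray(ySize + uvSize, ySize + 2 * uvSize);
  const rgba = new Uint8ClampedArray(width * height * 4); // clamps on write
  for (let row = 0; row < height; row++) {
    for (let col = 0; col < width; col++) {
      // each U/V sample covers a 2x2 block of luma samples
      const uvIndex = (row >> 1) * (width >> 1) + (col >> 1);
      const c = 1.164 * (yPlane[row * width + col] - 16);
      const u = uPlane[uvIndex] - 128;
      const v = vPlane[uvIndex] - 128;
      const i = (row * width + col) * 4;
      rgba[i]     = c + 1.596 * v;              // R
      rgba[i + 1] = c - 0.391 * u - 0.813 * v;  // G
      rgba[i + 2] = c + 2.018 * u;              // B
      rgba[i + 3] = 255;                        // A (opaque)
    }
  }
  return rgba;
}
```

A scrambled-but-right-sized image is the classic symptom of reading this buffer as if it were interleaved RGB(A): the three planes get interpreted as pixel rows.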
Thank you, I fixed it by converting the raw data with these functions:
https://gist.github.com/ryohey/ee6a4d9a7293d66944b1ef9489807783
Hello, I would like to get an ImageBitmap after onPictureDecoded is fired, but when I try to convert the raw data to an ImageBitmap I get a scrambled image. The returned width and height properties are correct, but the image is wrong when drawn on a canvas.

How should I interpret the raw format returned by onPictureDecoded?
I need to work this way because I want to draw the frame into a BabylonJS texture (a WebGL engine). I don't want to use the Player.