cubatertech opened 7 years ago
Sure, you should generate an OpenGL texture from your VideoRenderer.I420Frame and draw it into the CGEFrameRenderer every frame. Follow these steps:
- Create a TextureDrawer (org.wysaid.common.TextureDrawer) and keep it.
- Create a texture with Common.genBlankTextureID(frameWidth, frameHeight) and keep it.
- Call glTexSubImage2D with your VideoRenderer.I420Frame data (or buffer) to update the content of the texture. (Other functions may work too; the goal is just to turn the VideoRenderer.I420Frame into a texture.)
- Call bindImageFBO() on your CGEFrameRenderer instance, set the glViewport, and then use the drawer to draw the texture.
- Apply filters...
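The steps above could be sketched roughly as follows. This is a minimal sketch, not a tested implementation: it assumes the wysaid GPUImage-plus classes (`TextureDrawer`, `Common`, `CGEFrameRenderer`), a current GL context on the renderer thread, and that you have already converted the I420 planes to an RGBA buffer (the conversion itself, and the exact method signatures, should be checked against your version of the library).

```java
import java.nio.ByteBuffer;

import android.opengl.GLES20;

import org.wysaid.common.Common;
import org.wysaid.common.TextureDrawer;
import org.wysaid.nativePort.CGEFrameRenderer;

// Hypothetical wrapper class; names here are illustrative, not from the library.
public class I420FilterRenderer {

    private TextureDrawer mDrawer;
    private int mTextureID;
    private CGEFrameRenderer mFrameRenderer;
    private int mWidth, mHeight;

    // Call once, with a current GL context on the renderer thread.
    public void init(int frameWidth, int frameHeight) {
        mWidth = frameWidth;
        mHeight = frameHeight;

        // Step 1: create a TextureDrawer and keep it.
        mDrawer = TextureDrawer.create();

        // Step 2: create a blank texture to receive the frame data.
        mTextureID = Common.genBlankTextureID(frameWidth, frameHeight);

        mFrameRenderer = new CGEFrameRenderer();
        mFrameRenderer.init(frameWidth, frameHeight, frameWidth, frameHeight);
        // Any CGE filter rule string; "@adjust brightness 0.1" is just an example.
        mFrameRenderer.setFilterWidthConfig("@adjust brightness 0.1");
    }

    // Call per frame, on the GL thread. rgbaPixels is assumed to hold the
    // I420 frame already converted to RGBA (width * height * 4 bytes).
    public void onFrame(ByteBuffer rgbaPixels) {
        // Step 3: update the texture content from the frame data.
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureID);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0,
                mWidth, mHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE,
                rgbaPixels);

        // Step 4: draw the texture into the renderer's FBO.
        mFrameRenderer.bindImageFBO();
        GLES20.glViewport(0, 0, mWidth, mHeight);
        mDrawer.drawTexture(mTextureID);

        // Step 5: run the filter chain and render the result.
        mFrameRenderer.runProc();
        mFrameRenderer.render(0, 0, mWidth, mHeight);
    }
}
```

A YUV-to-RGBA fragment shader (sampling the Y, U, and V planes as separate luminance textures) would avoid the CPU-side conversion entirely, which matters if the bitmap path is what was causing the audio/video desync.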
Thank you very much, dear, for your quick response.
I'll check and send an ACK back to you. Thanks
Dear, do you have any example of this process?
Maybe later
Thank you.
I've been really stuck on this problem for the last two days and am not getting anywhere.
Hi @cubatertech, are you still stuck there? I'm thinking of adding a new camera demo with data preview.
Hi, I want to generate a preview from VideoRenderer.I420Frame in a WebRTC stream, with some filters applied. Is it possible to use the GPUImage-plus lib for this purpose? Please give me some tips. If we generate bitmaps from the stream and then apply filters or do any other processing, the video no longer stays in sync with the audio; the frames feel very delayed. Thanks