jc-kynesim / hello_drmprime

Simple ffmpeg h265 drm output prog

Is it possible to convert this for DMABUF pipeline? #1

Open Fredrum opened 3 years ago

Fredrum commented 3 years ago

Hello! Just tried this on my RPi4b and it runs beautifully smooth. (I know you're not pacing it.) Would it be possible to convert this to do a zero- or one-copy dmabuf-to-OpenGL texture pipeline? Like what I have here for v4l2 camera video: https://github.com/Fredrum/rpi_v4l2_tests/blob/master/glDmaTexture.c

If it is, would you have any broad pointers?

Cheers!

jc-kynesim commented 3 years ago

Yes - the code is all but identical - but beware that current mesa only copes with normal YUV/NV12 textures; it doesn't cope with SAND8 or SAND30, which is what H265 decode gives. Head-of-tree mesa should cope with SAND8 (8-bit), but SAND30 (for 10-bit) is still in the works, and due to its unique format it isn't clear if it will be possible on the Pi4. Also beware that the GL h/w is good up to HD, but 4k at a useful Hz is beyond it. If you want a pointer, grab my ffmpeg repo https://github.com/jc-kynesim/rpi-ffmpeg.git, branch test/4.3.2/rpi_main, and look at egl_vout.c
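
For concreteness, the import step looks roughly like the sketch below (untested, most error handling omitted): take the AVDRMFrameDescriptor that a DRM_PRIME AVFrame carries in data[0] and feed its dmabuf fd, offsets and pitches to EGL_EXT_image_dma_buf_import, then bind the resulting EGLImage to a GL_TEXTURE_EXTERNAL_OES texture. It assumes a plain two-plane NV12 frame, i.e. not SAND.

```c
/*
 * Minimal sketch (untested): import one DRM_PRIME AVFrame as a
 * GL_TEXTURE_EXTERNAL_OES texture via EGL_EXT_image_dma_buf_import.
 * Assumes an NV12 frame; error handling is mostly omitted.
 */
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <libavutil/frame.h>
#include <libavutil/hwcontext_drm.h>

static GLuint import_drmprime_frame(EGLDisplay dpy, const AVFrame *frame)
{
    const AVDRMFrameDescriptor *desc =
        (const AVDRMFrameDescriptor *)frame->data[0];
    const AVDRMLayerDescriptor *layer = &desc->layers[0];

    /* EGL extension entry points have to be fetched at runtime. */
    PFNEGLCREATEIMAGEKHRPROC create_image =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    PFNGLEGLIMAGETARGETTEXTURE2DOESPROC image_target_texture =
        (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
            eglGetProcAddress("glEGLImageTargetTexture2DOES");

    EGLint attrs[] = {
        EGL_WIDTH,  frame->width,
        EGL_HEIGHT, frame->height,
        EGL_LINUX_DRM_FOURCC_EXT, (EGLint)layer->format,   /* e.g. NV12 */
        /* Luma plane */
        EGL_DMA_BUF_PLANE0_FD_EXT,
            desc->objects[layer->planes[0].object_index].fd,
        EGL_DMA_BUF_PLANE0_OFFSET_EXT, (EGLint)layer->planes[0].offset,
        EGL_DMA_BUF_PLANE0_PITCH_EXT,  (EGLint)layer->planes[0].pitch,
        /* Chroma plane */
        EGL_DMA_BUF_PLANE1_FD_EXT,
            desc->objects[layer->planes[1].object_index].fd,
        EGL_DMA_BUF_PLANE1_OFFSET_EXT, (EGLint)layer->planes[1].offset,
        EGL_DMA_BUF_PLANE1_PITCH_EXT,  (EGLint)layer->planes[1].pitch,
        EGL_NONE
    };

    EGLImageKHR image = create_image(dpy, EGL_NO_CONTEXT,
                                     EGL_LINUX_DMA_BUF_EXT, NULL, attrs);

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* Attach the dmabuf-backed EGLImage to the external texture. */
    image_target_texture(GL_TEXTURE_EXTERNAL_OES, image);
    return tex;
}
```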

Fredrum commented 3 years ago

Hi, and thanks so much for the notes! I haven't followed all the HEVC progress super closely, so I'll have to look into SAND etc. I have a use case that doesn't need 4k, so I could just start with that. Or maybe I can swap that project over to DRM instead, although it is currently using some X11, which I might be able to get rid of.

So does drm(prime?) rendering do 4K fine on RPi4?

jc-kynesim commented 3 years ago

drm does 4k fine - it points the display h/w directly at the decoded buffer, so no copying or anything else is required, which is just as well, as any 4k frame copy or convert takes too much time on a Pi4.
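
For reference, the zero-copy display path is roughly: import the frame's dmabuf(s) as GEM handles, wrap them in a DRM framebuffer, and hand that framebuffer to a plane. A minimal sketch (untested; assumes drm_fd, plane_id and crtc_id were found elsewhere, and that page-flip timing, sync and cleanup are handled separately):

```c
/*
 * Minimal sketch (untested): wrap a decoded DRM_PRIME frame in a DRM
 * framebuffer and put it on a plane, so the display scans out the
 * decoder's buffer directly, with no pixel copies.
 */
#include <xf86drm.h>
#include <xf86drmMode.h>
#include <libavutil/frame.h>
#include <libavutil/hwcontext_drm.h>

static int show_drmprime_frame(int drm_fd, uint32_t plane_id,
                               uint32_t crtc_id, const AVFrame *frame)
{
    const AVDRMFrameDescriptor *desc =
        (const AVDRMFrameDescriptor *)frame->data[0];
    const AVDRMLayerDescriptor *layer = &desc->layers[0];
    uint32_t handles[4] = {0}, pitches[4] = {0}, offsets[4] = {0};
    uint64_t modifiers[4] = {0};
    uint32_t bo_handles[AV_DRM_MAX_PLANES] = {0};
    uint32_t fb_id = 0;

    /* Turn each dmabuf fd into a GEM handle on this DRM device. */
    for (int i = 0; i < desc->nb_objects; i++)
        drmPrimeFDToHandle(drm_fd, desc->objects[i].fd, &bo_handles[i]);

    for (int i = 0; i < layer->nb_planes; i++) {
        const AVDRMPlaneDescriptor *p = &layer->planes[i];
        handles[i]   = bo_handles[p->object_index];
        pitches[i]   = p->pitch;
        offsets[i]   = p->offset;
        modifiers[i] = desc->objects[p->object_index].format_modifier;
    }

    /* Create a framebuffer that references the decoder's buffer. */
    if (drmModeAddFB2WithModifiers(drm_fd, frame->width, frame->height,
                                   layer->format, handles, pitches, offsets,
                                   modifiers, &fb_id,
                                   DRM_MODE_FB_MODIFIERS) != 0)
        return -1;

    /* Display it 1:1 at the top left; src coords are 16.16 fixed point. */
    return drmModeSetPlane(drm_fd, plane_id, crtc_id, fb_id, 0,
                           0, 0, frame->width, frame->height,
                           0, 0, frame->width << 16, frame->height << 16);
}
```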

Fredrum commented 3 years ago

Cool, I think I'll explore that path! :) Cheers!

cbratschi commented 2 years ago

@Fredrum did you get any further with this? I need hardware accelerated video support in OpenGL too and could not find any implementation for the RPi 4.

jc-kynesim commented 2 years ago

The SAND8 EGL support should be backported Real Soon Now to mesa (but see the existing caveats re. performance); SAND30 is still way off. If you want a better (fullscreen-only) DRM output framework, take a look at my drmu project (in particular the hello_drmu it builds), though beware that the API is still in flux if you want to use it; it is what I'm going to be using in VLC.

cbratschi commented 2 years ago

I maintain this project: https://github.com/cbratschi/aminogfx-gl

For the Pi 3 I implemented an OpenMAX-based video player, and I started a Pi 4 implementation two years ago which did not get far on the video side. Now I want to resume that work to get video support in OpenGL/EGL. FFmpeg is already used as the demuxer. I hope to find a solution for getting the decoded video frames into OpenGL buffers. Performance was always limited by OpenGL, but DRM planes are not flexible enough to composite complex user interfaces.

jc-kynesim commented 2 years ago

Can't argue with the DRM inflexibility. As I said earlier in this thread, look at egl_vout.c in my rpi-ffmpeg repo for how to get a drmprime frame onto the screen via EGL. This currently only works for h264_v4l2m2m decoded frames, as they are YUV or NV12, but it should soon work for 8-bit H265; 10-bit H265 isn't on the cards at the moment.
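
Since the SAND layouts are the main trap here, a minimal sketch (untested) of the kind of guard that restriction implies: only attempt the EGL import when the drmprime frame is plain linear NV12/YUV420.

```c
/*
 * Minimal sketch (untested): reject frames whose modifier indicates a
 * non-linear (e.g. Broadcom SAND) layout that mesa cannot sample from.
 */
#include <stdint.h>
#include <drm_fourcc.h>
#include <libavutil/hwcontext_drm.h>

static int frame_is_egl_importable(const AVDRMFrameDescriptor *desc)
{
    const uint64_t mod = desc->objects[0].format_modifier;
    const uint32_t fmt = desc->layers[0].format;

    if (mod != DRM_FORMAT_MOD_LINEAR && mod != DRM_FORMAT_MOD_INVALID)
        return 0;                       /* SAND8/SAND30 etc.: skip */
    return fmt == DRM_FORMAT_NV12 || fmt == DRM_FORMAT_YUV420;
}
```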

Fredrum commented 2 years ago

Hi, yes, it's working great for my project, both kmsdrm rendering and ffmpeg->OpenGL over the 'zero copy' path. It ends up using quite a lot more CPU than the old MMAL/ILClient way, but I haven't looked super closely at where that's happening. I'm suspecting somewhere in ffmpeg.

Like jc-k says, the Raspberry Pi people are saying that the Mesa backport has been done and is with their testing-before-it-goes-out team now. I have been using my own Mesa builds, versions 21.1.0 or higher.

And as I understand it, jc-k's Pi-specific ffmpeg improvements haven't gone upstream to that project, and therefore I haven't been able to build my project on Ubuntu yet. So you'd probably have to build a custom ffmpeg too if you want to support that.

Here's a video of the project I've been working on. See last 1/3 for it running on OpenGL (gles2) under X11 desktop. https://www.youtube.com/watch?v=qDthHmD30us

EDIT: For performance, in the video you can see the program running well on a Pi Zero 2 W, doing DRM rendering without X11. That's a 720p video stream, but I would guess that it could do 1080p fine too. I just don't have a PlayStation capable of outputting 1080p.

cbratschi commented 2 years ago

h264 playback is my first target, h265 later.

I tried to get the V4L2 code to run but the av_hwdevice_ctx_create() call always returns -14 "Bad address". I could not find this error message anywhere.
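
For comparison, here is a minimal sketch (untested) of the usual DRM hwdevice setup, loosely following FFmpeg's generic hw_decode pattern. The device path is an assumption (typically /dev/dri/card0 or /dev/dri/renderD128 on a Pi 4), and since -14 is -EFAULT, printing the error string and trying an explicit device node is a cheap first diagnostic.

```c
/*
 * Minimal sketch (untested): create a DRM hwdevice for FFmpeg.
 * The device path "/dev/dri/card0" is an assumption; adjust as needed.
 */
#include <stdio.h>
#include <libavutil/hwcontext.h>
#include <libavutil/error.h>

static AVBufferRef *create_drm_device(void)
{
    AVBufferRef *hw_device_ctx = NULL;
    int ret = av_hwdevice_ctx_create(&hw_device_ctx, AV_HWDEVICE_TYPE_DRM,
                                     "/dev/dri/card0", NULL, 0);
    if (ret < 0) {
        char buf[AV_ERROR_MAX_STRING_SIZE];
        fprintf(stderr, "av_hwdevice_ctx_create failed: %s\n",
                av_make_error_string(buf, sizeof(buf), ret));
        return NULL;
    }
    return hw_device_ctx;  /* attach to the codec context via av_buffer_ref() */
}
```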

The code in this repository seems to require your patched FFmpeg version or it fails because av_frame_cropped_width() is not defined.
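
av_frame_cropped_width()/av_frame_cropped_height() look like additions in the rpi-ffmpeg tree; against stock FFmpeg (3.4 or later) roughly the same values can be derived from AVFrame's standard crop fields. A minimal sketch (untested):

```c
/*
 * Minimal sketch (untested): approximate the rpi-ffmpeg helpers using
 * the crop_* fields that stock FFmpeg's AVFrame already carries.
 */
#include <libavutil/frame.h>

static int frame_cropped_width(const AVFrame *f)
{
    return f->width - (int)(f->crop_left + f->crop_right);
}

static int frame_cropped_height(const AVFrame *f)
{
    return f->height - (int)(f->crop_top + f->crop_bottom);
}
```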