rockchip-linux / kernel

BSP kernel source

Hardware acceleration with FFmpeg for RK3328 #224

Open gusarg81 opened 3 years ago

gusarg81 commented 3 years ago

Hi,

It's amazing that, after all this time, it is still not possible out of the box to easily get hardware acceleration with FFmpeg, which is a complete, cross-platform solution to record, convert and stream audio and video, so popular these days and used in so many apps.

That being said, I have a Rock64 SBC (with the RK3328 CPU) that I only want to use to stream 3 RTSP cams with FFmpeg (currently running Armbian Focal with the legacy kernel, 4.4.213). What is important is that one of those cams monitors my baby, who needs 24-hour supervision because of his health condition, and these days we have to use what we have because nothing is cheap (and even less so here in Argentina).

So, there are many supposed howtos, but every one of them is far from being a complete set of steps to make it work. What I need is all the steps: which libs from the rockchip repos (and others if needed) have to be compiled, with all the necessary compilation flags, to make hardware acceleration work with FFmpeg (and, besides the --enable-rkmpp flag, anything else needed there).

So far, this is what I know is necessary (besides one of the supported kernels, 4.4.x or 4.19.x); correct me if something is missing from this list or is not needed:

Again, remember that this is about making FFmpeg hardware acceleration work (and I know, if I am not wrong, that Rockchip only supports decoding through FFmpeg; encoding must be done through GStreamer).
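For reference, this is roughly the build recipe I have pieced together so far (the MPP cmake options and the extra FFmpeg configure flags are my own assumptions from various howtos, so please correct them if wrong):

    # Rockchip MPP library -- cmake options as suggested in assorted guides
    git clone https://github.com/rockchip-linux/mpp.git
    cd mpp
    cmake -DRKPLATFORM=ON -DHAVE_DRM=ON .
    make && sudo make install

    # FFmpeg: --enable-rkmpp also requires --enable-libdrm and --enable-version3
    ./configure --enable-gpl --enable-version3 --enable-libdrm --enable-rkmpp
    make -j$(nproc)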

I know this is not the best place to report this, but I think there is no other one that comes from Rockchip.

Thanks in advance.

gusarg81 commented 3 years ago

Still no reply to this?...

andir commented 3 years ago

I am in a similar boat as you. I'm using a RockPi 4B, and for all I care I'd just like to play h264 content with hardware acceleration. Recently I stumbled upon the v4l driver in the mainline kernel's staging section and https://github.com/Kwiboo/FFmpeg/commits/v4l2-request-hwaccel-4.2.2 from @Kwiboo. I am not yet entirely sure what combination of custom tools I'll need to test this or get it going.

I hope all I need is the v4l user space API, a recent enough kernel (currently running 5.10-rc2) and maybe a patched ffmpeg (until upstream accepts those patches).
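If I read the patches right, decoding would then be invoked roughly like this (untested on my side, so treat the exact flags as an assumption):

    # h264 decode through the v4l2-request hwaccel of the patched FFmpeg build
    ./ffmpeg -hwaccel drm -hwaccel_device /dev/dri/card0 \
             -i input-h264.mkv -f null -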

gusarg81 commented 3 years ago

How to achieve this in different scenarios is very badly documented. And even if you get it working, it doesn't work right. For example, I managed to use the decoder with FFmpeg, and the decoded videos look so ugly, with so many artifacts... For now I give up until I have extra time to report this problem (yeah, because you don't even get a single reply).

bdherouville commented 2 years ago

Hi,

There is some discussion about Rockchip acceleration here: https://github.com/blakeblackshear/frigate/pull/1814

Hope we will be able to get the max performance out of our boards.

Cheers,

gusarg81 commented 2 years ago

@bdherouville Hi!

Nice, I will take a look at it. Also, I had never heard of that NVR (Frigate). I use Shinobi (and, in the past, Motion/MotionEye).

I have my Rock64 sitting in storage just because I can't get hardware accel to work right (encoding/decoding). I also want to start a project, an IP doorbell (with face recognition and much more), and the only (small) hardware I have to test with is this one.

By the time we get Rockchip hwaccel to work, the hardware will be obsolete (which, in a way, the Rock64 already is).

Maigre commented 2 years ago

Haha, I am in exactly the same boat (but with an RK3399)... hard to tell how to properly configure everything. But I'm following the issue!

gusarg81 commented 2 years ago

@Maigre read the long post there: https://github.com/blakeblackshear/frigate/pull/1814#issuecomment-939294837. I am following those instructions.

gusarg81 commented 2 years ago

Well, anyway, it seems the path is not rkmpp anymore, since that is for legacy kernels. On new kernels (5.13 and newer) it is done via v4l2-request and libdrm. But this is still too green; I haven't figured out how to make it work correctly yet, there are too many forks of FFmpeg, not a single proper set of instructions on how to make it work, etc. etc.

We have to wait, but in the meantime this board is getting older and older.

avafinger commented 2 years ago

@gusarg81 If you just want to stream RTSP, take a look at: https://github.com/avafinger/nanopi-r4s-minimal-image#h264-camera The original code is for H264, since HTML5 does not support H265. This will work on any platform.

gusarg81 commented 2 years ago

@avafinger Hi,

Thanks, I will take a look (especially at videostreamer). By the way, it seems decoding is working now (started from scratch with Armbian, following the post from blakeblackshear/frigate#1814 again).

Since that is solved, now comes the encoding part. My original idea is to use rtsp-simple-server (https://github.com/aler9/rtsp-simple-server), and for that I need encoding. So, is there a way to achieve hwaccel encoding?

avafinger commented 2 years ago

Encoding works on the legacy kernel (with GStreamer), not with FFmpeg. You need rockchip mpp, gstreamer and rga.
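With the gstreamer-rockchip plugin installed, an encoding pipeline looks more or less like this (element names and caps from memory, adjust as needed):

    gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! \
        video/x-raw,format=NV12,width=1920,height=1080 ! \
        mpph264enc ! h264parse ! matroskamux ! filesink location=out.mkv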

videostreamer is nice, but it has no sound. If you want video + audio you need to make some changes to add the audio and change MP4 to MKV.

Regarding RTSP from cams: usually you can get video + audio with VLC / ffplay from your host PC, or, if you have figured out how to decode, add the latest SDL2 and you can work with ffplay. See the commands here: https://github.com/avafinger/onvif_ip_camera_dome. And if you stick to the legacy kernel, you need to comment out the line mentioned here: https://github.com/libsdl-org/SDL/issues/4879#issuecomment-950439100
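For example, from the host PC (URL and credentials are placeholders for your own camera):

    # TCP transport avoids the packet-loss artifacts you often get over UDP
    ffplay -rtsp_transport tcp "rtsp://user:pass@192.168.1.10:554/stream1"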

avafinger commented 2 years ago

@gusarg81 And if you want to stream video from a USB cam (MJPEG) and achieve 30 FPS, use mjpg-streamer and make sure your sensor delivers MJPEG; if it delivers JPEG you need a fix like this one: https://github.com/avafinger/bananapi-zero-ubuntu-base-minimal/issues/56#issuecomment-800451720
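A typical invocation (device, resolution, port and www path are just examples):

    mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 1920x1080 -f 30" \
                  -o "output_http.so -p 8080 -w /usr/local/share/mjpg-streamer/www"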

gusarg81 commented 2 years ago

The USB UVC camera I use supports MJPEG and YUYV:

v4l2-ctl --list-formats -d /dev/video1

    ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'MJPG' (Motion-JPEG, compressed)
    [1]: 'YUYV' (YUYV 4:2:2)

And the idea, as I mentioned once, is to use MJPEG because it is the format that supports 30 FPS at FHD resolution. So, the plan is to use this format to stream as RTSP at first and also as HLS (the latter to make it easier to capture via web in a platform that I will develop later). That's why the idea was rtsp-simple-server.
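What I have in mind is something like this (untested; it assumes rtsp-simple-server on its default port 8554 and that it accepts an M-JPEG track, otherwise I would have to re-encode to H264):

    ffmpeg -f v4l2 -input_format mjpeg -framerate 30 -video_size 1920x1080 \
           -i /dev/video1 -c:v copy -f rtsp rtsp://localhost:8554/cam1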

EDIT: just to mention, the idea is not to stick with the legacy kernel, because there is no progress there at all and things work really badly; I've tested it many times, and there is not even decent hw decoding.

avafinger commented 2 years ago

RTSP:
Audio codecs: AAC, AAC-LC, HE-AAC+ v1 & v2, MP3, Speex, Opus, Vorbis
Video codecs: H.265 (preview), H.264, VP9, VP8

Read carefully what I have written, I am sure it will save you time.

I don't know the state of the HW encoder in the mainline.

gusarg81 commented 2 years ago

mjpg-streamer works nicely by the way, with very low CPU usage.

Seems that will do for now (even the HTTP output is quite nice, and the stream delay is minimal).

Now I must think about the sound part. The USB camera I use, an Arducam (https://www.arducam.com/product-category/uvc-usb-camera-module/usb-uvc-cameras-night-vision/), does not have sound, so my idea is to add a USB mic.

And a way to mix the stream (mjpg-streamer) and the audio input.
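Maybe something like this with FFmpeg, grabbing the MJPEG HTTP stream from mjpg-streamer and the mic via ALSA (the device names and the port are guesses for my setup):

    # hw:1,0 = USB mic; 8080 = mjpg-streamer HTTP port
    ffmpeg -i "http://localhost:8080/?action=stream" \
           -f alsa -i hw:1,0 \
           -c:v copy -c:a aac out.mkv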