Sunoo / homebridge-camera-ffmpeg

Homebridge Plugin Providing FFmpeg-based Camera Support
https://sunoo.github.io/homebridge-camera-ffmpeg/
Apache License 2.0

Hardware Acceleration Using VAAPI (Intel GPUs) #1223

Open washcroft opened 2 years ago

washcroft commented 2 years ago

Thanks to @mickgiles for the inspiration in #508, I wanted to share what worked for me in newer versions (v3) of homebridge-camera-ffmpeg. If running in docker, remember to pass through your /dev/dri:/dev/dri devices.

Obviously you'll need a working copy of ffmpeg that has been built with VAAPI (h264_vaapi) support. Lots of resources for compiling your own out there.
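
As a quick sanity check before wiring anything into the plugin (the container name, image and volume path below are just placeholders, not part of this setup), you can confirm the ffmpeg build and the render device, and pass /dev/dri through to a Docker container, roughly like this:

# Confirm the ffmpeg build exposes the VAAPI encoder and that libva can see the GPU
ffmpeg -hide_banner -encoders | grep vaapi
vainfo    # from libva-utils; should list H.264 encode entrypoints

# Example only: passing the render devices through when running Homebridge in Docker
docker run -d --name homebridge \
  --device /dev/dri:/dev/dri \
  -v /path/to/homebridge:/homebridge \
  homebridge/homebridge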

ffmpeg.sh Script

The script acts as a middleman between Homebridge and ffmpeg. It is needed to fix a few hardcoded elements in the ffmpeg command; it wouldn't be necessary if the plugin offered the additional config control. Also, #504 is still an issue and should be reopened.

Update the CMD variable to be the actual path to ffmpeg.

What it does:

#!/bin/bash
# Wrapper around ffmpeg: rewrites the arguments homebridge-camera-ffmpeg passes in
# so that the VAAPI encoder gets the pixel format and filter names it expects.
CMD="/home/user/bin/ffmpeg"
SNAPSHOT="FALSE"

# Walk every argument except the last one (the output target)
while [[ $# -gt 1 ]]
do
key="$1"

# A -frames:v argument means the plugin is taking a snapshot, not streaming
if [[ "${key}" == "-frames:v" ]]; then
    SNAPSHOT="TRUE"
fi

case ${key} in
    -pix_fmt)
        # Keep the flag but force the value to vaapi
        CMD+=" $1 vaapi"
        shift
    ;;
    -filter:v)
        # Keep the video filter when streaming; drop the flag (and, via the
        # shift after esac, its value) when taking a snapshot
        if [[ "${SNAPSHOT}" == "FALSE" ]]; then
            CMD+=" $1"
        else
            shift
        fi
    ;;
    *)
        CMD+=" $1"
    ;;
esac
shift
done

# Swap the software scaler for its VAAPI counterpart, then run the real ffmpeg
# with the rebuilt arguments plus the final (output) argument.
CMD="${CMD//scale=/scale_vaapi=}"
exec $CMD ${!#}
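
For illustration only (the argument values here are made up), this is roughly how the wrapper rewrites what the plugin passes in:

# Received from homebridge-camera-ffmpeg (simplified):
#   ffmpeg.sh -i rtsp://camera/stream -vcodec h264_vaapi -pix_fmt yuv420p -filter:v scale=1280:720 ... srtp://...
# Re-executed by the wrapper as:
#   /home/user/bin/ffmpeg -i rtsp://camera/stream -vcodec h264_vaapi -pix_fmt vaapi -filter:v scale_vaapi=1280:720 ... srtp://...
# When a snapshot is requested (-frames:v present), -filter:v and its value are dropped instead.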

Plugin config: Update the videoProcessor value to be the path to your ffmpeg.sh script.

This seems to work with and without videoFilter, but I have left it in for now as it's used in all VAAPI examples I've seen.

"platform": "Camera-ffmpeg",
"videoProcessor": "/home/user/bin/ffmpeg.sh",
"cameras": [
    {
        "name": "Front Camera",
        "videoConfig": {
            "source": "-vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -rtsp_transport tcp -i rtsp://axis-accc8e6d470a.local/axis-media/media.amp?streamprofile=Media?tcp",
            "stillImageSource": "-i http://axis-accc8e6d470a.local/axis-cgi/jpg/image.cgi?resolution=1280x720",
            "encoderOptions": "-bf 0",
            "vcodec": "h264_vaapi",
            "videoFilter": "format=nv12|vaapi,hwupload"
            "maxBitrate": 0,
            "packetSize": 940
        }
    }

Config notes:

  • maxBitrate and packetSize are not essential, but gave me much higher quality streams
  • encoderOptions set to -bf 0 is the same as using the traditional bframes=0; without this the frame rate is very poor
  • If the H264 profile of your camera is baseline (normally the case with lower-power, cheaper devices), you might need to add the following parameter to the source value to have ffmpeg/VAAPI ignore the profile and try to process the stream anyway: -hwaccel_flags allow_profile_mismatch

View your live camera in the Home app, and then run intel_gpu_top on the host machine to confirm the GPU cores are being used. Success.
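
As a rough guide to what to look for (assuming intel-gpu-tools is installed on the host):

sudo intel_gpu_top
# While the stream is playing, the Video engine's busy percentage should rise above 0%.
# If it stays at 0%, the encode is still running on the CPU.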

Sunoo commented 2 years ago

Interesting. I’ll have to look into this closer later, but I think I’d be willing to tweak this plugin so your wrapper isn’t needed.

Also, I could probably work to get vaapi support compiled into the ffmpeg-for-homebridge package, if it’s not there currently.

washcroft commented 2 years ago

Also, I could probably work to get vaapi support compiled into the ffmpeg-for-homebridge package, if it’s not there currently.

That would be awesome if possible, but I think there are licensing issues with distribution and ffmpeg-for-homebridge distributes static binaries.

Sunoo commented 2 years ago

Ah, it’s one of those things. I’ll see what I can find out.

Sunoo commented 2 years ago

Based on my quick looking, it sounds like it should be okay to include: https://trac.ffmpeg.org/wiki/HWAccelIntro#VAAPI

washcroft commented 2 years ago

Based on my quick looking, it sounds like it should be okay to include: https://trac.ffmpeg.org/wiki/HWAccelIntro#VAAPI

Great! That being included in ffmpeg-for-homebridge and any tweaks you can make to add additional plugin config flexibility to eliminate the need for the wrapper script would definitely help people looking to improve their HomeKit camera experiences, especially with HSV on the way.

Sunoo commented 2 years ago

The only trick is that I won’t be able to test myself, as I don’t have any Intel CPUs around, but I can mirror what your script does and it should work.

washcroft commented 2 years ago

The only trick is that I won’t be able to test myself, as I don’t have any Intel CPUs around, but I can mirror what your script does and it should work.

I can do the testing for you. I'm thinking config-wise all we'd need is:

  • scaleFilterName - defaulted to "scale"
  • pixelFormat - defaulted to "yuv420p"
  • stillImageFilter - defaulted to whatever videoFilter is set to (same as today), and used when doing snapshots rather than streaming
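
For illustration, that might look something like this inside videoConfig (these keys are only the proposals above, not options the plugin actually supports):

"videoConfig": {
    "vcodec": "h264_vaapi",
    "pixelFormat": "vaapi",
    "scaleFilterName": "scale_vaapi",
    "videoFilter": "format=nv12|vaapi,hwupload",
    "stillImageFilter": "none"
}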

Sunoo commented 2 years ago

I have so many options that I was hoping to avoid adding more. I was considering handling this by switching those values out when the user sets vcodec to h264_vaapi, similar to how the plugin already handles the copy vcodec.

washcroft commented 2 years ago

I have so many options that I was hoping to avoid adding more. I was considering handling this by switching those values out when the user sets vcodec to h264_vaapi, similar to how the plugin already handles the copy vcodec.

Or that works :) I just thought these config values would allow total flexibility, e.g. somebody might want to use a different hardware acceleration library.

Sunoo commented 2 years ago

There are a few that work already; h264_videotoolbox and h264_omx come to mind, but neither of those requires any special settings.

washcroft commented 2 years ago

Yeah, I did wonder how omx was handled. The main thing is the filters; they're named differently for VAAPI, so maybe hardcode those and the pixel format when using h264_vaapi.

I still think stillImageFilter as a config value would be useful, as the source could be entirely different. It should support the none keyword like videoFilter does.

Sunoo commented 2 years ago

I’ll consider stillImageFilter, though I’m not sure I understand when that would really be needed? If it’s just for this (which it sounds like it works fine without it, so I’d want to look into it more), then I’d be better off triggering it with the vcodec setting.

washcroft commented 2 years ago

Hmm, I don't actually have an issue with snapshots anymore when I disable that part of the wrapper script, as I don't need anything special in videoFilter.

I suppose an issue could arise if you had source set to your HW-accelerated RTSP stream and stillImageSource set to something like a JPEG/MJPEG capture, but you needed some HW-acceleration-specific flags in videoFilter for source to work properly. That videoFilter would make no sense for the still image capture. This is probably an edge case though, and can probably be left for another day, if it ever arises.

So all we need is when vcodec is set to h264_vaapi:

  1. Use -pix_fmt vaapi instead of -pix_fmt yuv420p
  2. Use scale_vaapi= instead of scale= in the filters (only in resInfo.videoFilter, not resInfo.resizeFilter).

github-actions[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

washcroft commented 2 years ago

@Sunoo Any chance we can reopen this? Confirm what changes (if any) you want and I'll include them.

Sunoo commented 2 years ago

Sorry, lost sight of this and the bot grabbed it. I’ll try to make these changes once HKSV is out.

washcroft commented 2 years ago

I'm not sure how I'm only just discovering this now, but it seems force_original_aspect_ratio isn't working all of the time either, probably something to do with ffmpeg build options/library versions:

[Parsed_scale_vaapi_0 @ 0x556072fe9dc0] [error] Option 'force_original_aspect_ratio' not found

So I'm now using this videoFilter to prevent any other filters from being automatically added; the downside is that it loses the ability to scale the output to the configured or requested width/height.

"videoFilter": "none,format=nv12|vaapi,hwupload"

Or if scaling is needed, something like these:

"videoFilter": "none,format=nv12|vaapi,hwupload,scale_vaapi=iw/2:ih/2"

"videoFilter": "none,format=nv12|vaapi,hwupload,scale_vaapi=w=1920:h=1080"

I still need to use the wrapper script to change -pix_fmt yuv420p to -pix_fmt vaapi, and remove -filter:v entirely when a snapshot is being taken.

marvelloard commented 2 years ago

It would be extremely helpful if VAAPI were directly available from the plugin. Any chance it will be integrated directly?

marvelloard commented 1 year ago

On a Synology DS918+ with the ffmpeg5 package, I get: Impossible to convert between the formats supported by the filter 'Parsed_hwupload_1' and the filter 'auto_scale_0'

Can somebody help me please?

VCTGomes commented 1 year ago

[Quoting washcroft's original post above in full.]

Hey, is it possible to create the same kind of script to fix the integration with Camera UI too? It works fine with homebridge-camera-ffmpeg, but not with Camera UI.

The Camera UI ffmpeg command looks like the one below; I don't know where it's breaking:

[13/08/2023, 01:24:49] [CameraUI] Campainha: Stream command: /usr/transcode/ffmpeg_convert.sh -hide_banner -loglevel verbose -vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -rtsp_transport tcp -i rtsp://admin:xxxxx@192.168.2.99:554/Streaming/channels/101 -an -sn -dn -r 30 -vcodec h264_vaapi -pix_fmt yuv420p -color_range mpeg -f rawvideo -preset ultrafast -tune zerolatency -filter:v scale='min(1280,iw)':'min(720,ih)':force_original_aspect_ratio=decrease,scale=trunc(iw/2)*2:trunc(ih/2)*2 -b:v 299k -bufsize 598k -maxrate 299k -payload_type 99 -ssrc 15007668 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params +9eF1r951vfENlO4zJMcFyDgiyZMXABZ1SnxPHiq srtp://192.168.2.154:60989?rtcpport=60989&pkt_size=1318 -vn -sn -dn -acodec libfdk_aac -profile:a aac_eld -flags +global_header -f null -ar 16k -b:a 24k -ac 1 -payload_type 110 -ssrc 14493720 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params HzoXAzE3kXcNiUQx2J6WrLzbAzYixjPGFmDC2mh1 srtp://192.168.2.154:59186?rtcpport=59186&pkt_size=188 -progress pipe:1

mickgiles commented 1 year ago

In the camera-ui configuration, under Options, you can configure the Video Processor. In there you could just put the script above that points to your custom-built ffmpeg. I now run on a different platform so I can't really try this, but it looks like it should work.

VCTGomes commented 1 year ago

In the camera-ui configuration, under Options, you can configure the Video Processor. In there you could just put the script above that points to your custom built ffmpeg. I now run on a different platform so can't really try this but it looks like it should work.

I tried it; unfortunately it doesn't work. The same script works with the Homebridge ffmpeg plugin.

That's what I got:

Campainha: FFmpeg videoanalysis process exited with error! (null)
Impossible to convert between the formats supported by the filter 'Parsed_fps_0' and the filter 'auto_scale_0'
[vf#0:0 @ 0x5604e30278c0] Error reinitializing filters!
Failed to inject frame into filter network: Function not implemented
Error while filtering: Function not implemented
[out#0/image2pipe @ 0x5604e319e980] Nothing was written into output file, because at least one of its streams received no packets.

marvelloard commented 1 year ago

[Quoting washcroft's original post and VCTGomes' Camera UI question above in full.]

Finally got HW working thanks to this post! (Synology package, not Docker)