It seems that when I use it under Home Assistant like this: `-vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi`, my Home Assistant cannot restart the Docker container anymore.
Hi,
I have not experimented much with ffmpeg. Use these settings, and let me know if they work.

Source Stream: `-rtsp_transport tcp -i rtsp://url_for_camera_stream`
Encoder: `libx264`
It may be that the user account the process is running under does not have access to certain hardware.
I use h264_omx and it runs pretty well.
actually h264_omx won't work if running on a NAS. try libx264 instead
Mine is a DS918+ with a J1900, without h264_omx. This is the H.264 support in the ffmpeg installed on my NAS:

```
--arch=x86_64 --enable-vaapi --enable-libmfx
libavutil      56. 31.100 / 56. 31.100
libavcodec     58. 54.100 / 58. 54.100
libavformat    58. 29.100 / 58. 29.100
libavdevice    58.  8.100 / 58.  8.100
libavfilter     7. 57.100 /  7. 57.100
libavresample   4.  0.  0 /  4.  0.  0
libswscale      5.  5.100 /  5.  5.100
libswresample   3.  5.100 /  3.  5.100
libpostproc    55.  5.100 / 55.  5.100
DEV.LS h264  H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 (decoders: h264 h264_qsv) (encoders: libx264 libx264rgb h264_qsv h264_vaapi)
```
When I use libx264 or h264 and view the stream on my phone, the CPU load on the NAS is 50% or higher.
ok...
Fire up `top` in a terminal and capture the ffmpeg process - it will give you the exact arguments being passed to ffmpeg when viewing a stream in HomeKit.
Then use those same arguments in a terminal - ffmpeg should return some errors. I haven't enabled much in the way of debugging in the nodes currently.
OK - do I use the parameters that enable hardware decoding, or the default parameters? Should I copy exactly the parameters that HomeKit passes to ffmpeg?
Configure the node as you would like. Then, once done, fire up top like this: `top -c`.
Then view the stream in HomeKit.
top will capture the arguments. When you have the exact command from top, run that same command in a terminal and see what output ffmpeg gives you.
Complete command:

```
ffmpeg -vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -i rtsp://admin:123@192.168.1.242:8554/live3.sdp -map 0:0 -vcodec h264 -pix_fmt yuv420p -r 10 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k -maxrate 299k -payload_type 99 -ssrc 6393353 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params … 'srtp://…?rtcpport=…&localrtcpport=61774&pkt_size=1316'
```
Running this directly in the terminal gives:

```
bash-5.0# ffmpeg version 4.2.1 Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 9.2.0 (Alpine 9.2.0)
configuration: --prefix=/usr --enable-avresample --enable-avfilter --enable-gnutls --enable-gpl --enable-libass --enable-libmp3lame --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libx264 --enable-libx265 --enable-libtheora --enable-libv4l2 --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-libxcb --disable-stripping --disable-static --disable-librtmp --enable-vaapi --enable-vdpau --enable-libopus --disable-debug
libavutil      56. 31.100 / 56. 31.100
libavcodec     58. 54.100 / 58. 54.100
libavformat    58. 29.100 / 58. 29.100
libavdevice    58.  8.100 / 58.  8.100
libavfilter     7. 57.100 /  7. 57.100
libavresample   4.  0.  0 /  4.  0.  0
libswscale      5.  5.100 /  5.  5.100
libswresample   3.  5.100 /  3.  5.100
libpostproc    55.  5.100 / 55.  5.100
```
`[3]+ Done  localrtcpport=61774` - seems like no problem.
And I'm guessing no stream is getting to the phone?
No output on the phone
ok..
Edit the command (and run it in a terminal), and instead of sending out to the phone, save it to a file:
```
ffmpeg -vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -i rtsp://admin:123@192.168.1.242:8554/live3.sdp -map 0:0 -vcodec h264 -pix_fmt yuv420p -r 10 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k -maxrate 299k video.mp4
```
If that still does not produce a viewable file (you will need to ctrl+c it), then we have pinpointed it down to ffmpeg.
OK, I'll test it. If there is still no output then ffmpeg is not working, is it?
that's from ffmpeg?
No, that was just because of the space problem in the URL. I am restarting the NAS now.
Seems to be a problem with the Docker container not recognizing the hardware:

```
[AVHWDeviceContext @ 0x7f77c4ae8380] No VA display found for device /dev/dri/renderD128.
Device creation failed: -22.
Failed to set value '/dev/dri/renderD128' for option 'vaapi_device': Invalid argument
Error parsing global options: Invalid argument
```
There you go! I'm no expert at ffmpeg, I'm afraid. It may even be Docker not being able to access or see the hardware - again, I'm not an expert in Docker either (.NET and NodeJS are my thing).
It's definitely a hardware config/access problem, as software rendering (libx264) is fine - apart from the high CPU usage, of course.
Hardware rendering on a Pi (using h264_omx) does work really well, however. Not sure what else I can say here - sorry.
Thanks for your help. I'll try adding the hardware mapping when starting Docker, and I'll give you feedback on whether it has any effect. Thank you.
thanks.
let me know how it goes.
You're welcome
It seems normal now. You need `docker run --name nodered --device=/dev/dri:/dev/dri nodered/node-red` when starting Docker. The purpose of this is to map the graphics card into the container, and then install ffmpeg and libva-intel-driver inside it, so VAAPI can be driven properly for hardware transcoding. If Docker prompts that you have no permission, connect to the container with root privileges: `docker exec -it -u root xxx /bin/bash`. Then you can `apk add ffmpeg libva-intel-driver`. Thank you for your answer, thank you.
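Summarising the steps above as commands - a sketch only, assuming the Alpine-based nodered/node-red image and the container name `nodered` from the run command:

```sh
# start node-red with the GPU render node mapped into the container
docker run -d --name nodered --device=/dev/dri:/dev/dri nodered/node-red

# open a root shell inside the running container
docker exec -it -u root nodered /bin/bash

# inside the container: install ffmpeg and the Intel VAAPI driver
apk add ffmpeg libva-intel-driver
```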
And this also now works in node red using the camera node?
I'm not testing under node-red right now, because this was built in a temporary environment and cannot be tested under node-red. I'll wait until I re-establish node-red for testing, and then I will tell you. Thanks.
After trying, it still won't start when using VAAPI. Thank you.
OK - let's take a step back and do this a different way.

Set up the node as required (don't bias it for this debugging).

Find the prototype `Camera.prototype.handleStreamRequest` in camera.js. In here, add the ability to capture stderr from the ffmpeg process.
Under

```js
const ffmpeg = spawn("ffmpeg", fcmd.split(' '), { env: process.env, })
```

add

```js
// print stdout chunks from the ffmpeg child process
ffmpeg.stdout.on('data', function(data) {
    console.log(data)
})

// print stderr chunks as well
ffmpeg.stderr.on('data', function(data) {
    console.log(data)
})
```
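Worth noting: ffmpeg writes its banner, progress, and error messages to stderr rather than stdout, so the stderr handler is the one that will show why it fails.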
Then run node-red, but in a terminal, and attempt to view the stream on an iOS device.
The terminal should then report any error output from ffmpeg.
When I use the parameters that worked before, after adding the code per your instructions and turning on the camera, I get a string of output in the background. Then the HomeKit camera eventually stops and there is no image output.

In addition, I checked the wiki and learned that my J1900 CPU uses a Z3700-series graphics core, GEN7 generation; VAAPI does not support decoding on it, but does support encoding. I read this on the VAAPI page - here is the link: http://trac.ffmpeg.org/wiki/Hardware/VAAPI

> Encoding
> The encoders only accept input as VAAPI surfaces. If the input is in normal memory, it will need to be uploaded before giving the frames to the encoder - in the ffmpeg utility, the hwupload filter can be used for this. It will upload to a surface with the same layout as the software frame, so it may be necessary to add a format filter immediately before to get the input into the right format (hardware generally wants the nv12 layout, but most software functions use the yuv420p layout). The hwupload filter also requires a device to upload to, which needs to be defined before the filter graph is created.
> So, to use the default decoder for some input, then upload frames to VAAPI and encode with H.264 and default settings:
>
> ```
> ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 -vf 'format=nv12,hwupload' -c:v h264_vaapi output.mp4
> ```
>
> If the input is known to be hardware-decodable, then we can use the hwaccel:
>
> ```
> ffmpeg -hwaccel vaapi -hwaccel_output_format vaapi -hwaccel_device /dev/dri/renderD128 -i input.mp4 -c:v h264_vaapi output.mp4
> ```
>
> Finally, when the input may or may not be hardware decodable we can do:
>
> ```
> ffmpeg -init_hw_device vaapi=foo:/dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -hwaccel_device foo -i input.mp4 -filter_hw_device foo -vf 'format=nv12|vaapi,hwupload' -c:v h264_vaapi
> ```
>
> This works because the decoder will output either vaapi surfaces (if the hwaccel is usable) or software frames (if it isn't). In the first case, it matches the vaapi format and hwupload does nothing (it passes through hardware frames unchanged). In the second case, it matches the nv12 format and converts whatever the input is to that, then uploads. Performance will likely vary by a large amount depending which path is chosen, though.
>
> The supported encoders are:
>
> - H.262 / MPEG-2 part 2: mpeg2_vaapi
> - H.264 / MPEG-4 part 10 (AVC): h264_vaapi
> - H.265 / MPEG-H part 2 (HEVC): hevc_vaapi
> - MJPEG / JPEG: mjpeg_vaapi
> - VP8: vp8_vaapi
> - VP9: vp9_vaapi
>
> For an explanation of codec options, see http://www.ffmpeg.org/ffmpeg-codecs.html#VAAPI-encoders.
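Adapting the wiki's first example to the camera stream from earlier in this thread, an encode-to-file test might look like the following - a sketch only, untested on this hardware; `scale_vaapi` is used because, once frames are VAAPI surfaces after hwupload, scaling has to happen with the VAAPI scaler rather than the software scale filter:

```sh
ffmpeg -vaapi_device /dev/dri/renderD128 \
  -rtsp_transport tcp -i rtsp://admin:123@192.168.1.242:8554/live3.sdp \
  -vf 'format=nv12,hwupload,scale_vaapi=w=1280:h=720' \
  -c:v h264_vaapi -b:v 299k video.mp4
```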
To correct myself: there is no problem with the default parameters - everything is normal. Only when I change -vcodec to h264_vaapi does ffmpeg not start, and there is no error message in the terminal.
Sorry - change `console.log(data)` to `console.log(data.toString())` (the data event hands you a Buffer, so it needs converting to a string to be readable) and run it again (viewing a stream on your iOS device, with node-red running in a terminal).
https://wiki.archlinux.org/index.php/Hardware_video_acceleration#Comparison_tables Can you help me confirm whether my J1900 processor supports VAAPI hardware encoding, or QSV encoding?
Can you see if this can be solved?
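One way to check what the driver actually exposes, rather than relying on the comparison tables - a sketch, assuming the libva-utils package is available in the Alpine container:

```sh
# install the VAAPI query tool
apk add libva-utils

# list the profiles/entrypoints the driver supports:
# VAEntrypointEncSlice next to an H264 profile means hardware H.264 encode,
# VAEntrypointVLD means hardware decode
vainfo
```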
Ok,
can you send me your 'Source Stream' parameter.
Is this the one?

```
bash-5.0# ffmpeg -rtsp_transport tcp -i rtsp://admin:123@192.168.1.247:8554/live0.sdp
ffmpeg version 4.2.1 Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 9.2.0 (Alpine 9.2.0)
configuration: --prefix=/usr --enable-avresample --enable-avfilter --enable-gnutls --enable-gpl --enable-libass --enable-libmp3lame --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libx264 --enable-libx265 --enable-libtheora --enable-libv4l2 --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-libxcb --disable-stripping --disable-static --disable-librtmp --enable-vaapi --enable-vdpau --enable-libopus --disable-debug
libavutil      56. 31.100 / 56. 31.100
libavcodec     58. 54.100 / 58. 54.100
libavformat    58. 29.100 / 58. 29.100
libavdevice    58.  8.100 / 58.  8.100
libavfilter     7. 57.100 /  7. 57.100
libavresample   4.  0.  0 /  4.  0.  0
libswscale      5.  5.100 /  5.  5.100
libswresample   3.  5.100 /  3.  5.100
libpostproc    55.  5.100 / 55.  5.100
Input #0, rtsp, from 'rtsp://admin:123@192.168.1.247:8554/live0.sdp':
  Metadata:
    title           : Session Streamed by LIBZRTSP
    comment         : live0.sdp
  Duration: N/A, start: 0.085000, bitrate: N/A
    Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709, progressive), 1920x1080, 23.42 tbr, 90k tbn, 180k tbc
At least one output file must be specified
bash-5.0#
```

My current network environment is not stable, and my visits to github.com may be slow.
Or this one
that's the one.
I am on a Mac, so I'm going to configure it myself and see what I get.
Bear with me....
Thank you. Because I am not very proficient at compiling, I also tried to compile ffmpeg with Intel QSV and VAAPI enabled, but it failed. The statically compiled ffmpeg builds I found online do not support either of those two APIs.
Your NAS is x86, and as I am on a Mac I should be able to reproduce what's going on.
OK, the static build for OSX doesn't include vaapi:

```
Unknown encoder 'h264_vaapi'
```

I don't have the means to compile currently, so not sure how much more I can offer.
OK, thank you for your help
Question: does node-red run with root privileges? During this conversation you have had it working - what has changed from the below?
> It seems normal now. You need `docker run --name nodered --device=/dev/dri:/dev/dri nodered/node-red` when starting Docker. The purpose of this is to map the graphics card into the container, and then install ffmpeg and libva-intel-driver inside it, so VAAPI can be driven properly for hardware transcoding. If Docker prompts that you have no permission, connect to the container with root privileges: `docker exec -it -u root xxx /bin/bash`. Then you can `apk add ffmpeg libva-intel-driver`.
Docker has been given high DSM privileges to run, and the graphics card hardware has been mapped.
I think I already know the problem.
```
ffmpeg -rtsp_transport tcp -i rtsp://admin:123@192.168.1.472:8554/live3.sdp -map 0:0 -vcodec h264_vaapi -pix_fmt yuv420p -r 10 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k -maxrate 299k -payload_type 99 -ssrc 6571014 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params Lt+aaxTM650UQslZMLwOb/bciq3DuSBCIE98A5AS srtp://192.168.1.243:63686?rtcpport=63686&localrtcpport=63686&pkt_size=1316
```
This is the command that runs when the HomeKit stream is working.
I found the following while testing with the debugging change you gave me.
When I run it like this, I get a prompt that VAAPI cannot decode. Following the wiki's instructions, I changed `-pix_fmt yuv420p` to `-pix_fmt nv12,hwupload`, and that also reports an error, because this loads the stream first, transcodes it with h264 to the yuv420 format, and then sends the stream to HomeKit. But I didn't find a relevant example of this on the wiki, so maybe it doesn't work this way. Your Raspberry Pi h264_omx seems to support this format directly, but it doesn't work with VAAPI; it mentions vaapi_vld, but when I query ffmpeg there doesn't seem to be such an option. In other words, VAAPI does not support this format. Unfortunately, I have no way to test with Intel QSV, because the Docker image is the Alpine-based node-red and the ffmpeg I installed does not support QSV.
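For what it's worth, `-pix_fmt` is not a filter, so `nv12,hwupload` would have to go into `-vf` instead; and `-tune zerolatency` is a libx264 option that doesn't apply to h264_vaapi. A rewrite of the command above along the wiki's lines might look like this - a sketch only, untested; the srtp key and ports are taken from the command above, and `scale_vaapi` replaces the software scale since scaling must happen on the VAAPI surface after hwupload:

```sh
ffmpeg -vaapi_device /dev/dri/renderD128 \
  -rtsp_transport tcp -i rtsp://admin:123@192.168.1.472:8554/live3.sdp \
  -map 0:0 -vcodec h264_vaapi -r 10 \
  -vf 'format=nv12,hwupload,scale_vaapi=w=1280:h=720' \
  -b:v 299k -bufsize 299k -maxrate 299k \
  -payload_type 99 -ssrc 6571014 -f rtp \
  -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params Lt+aaxTM650UQslZMLwOb/bciq3DuSBCIE98A5AS \
  'srtp://192.168.1.243:63686?rtcpport=63686&localrtcpport=63686&pkt_size=1316'
```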
This might be of interest. the camera node is based on homebridge-camera-ffmpeg
https://github.com/KhaosT/homebridge-camera-ffmpeg/issues/306
It seems I'm not the first one to run into this problem.
It seems interesting
The -vf parameter is used to scale down the video (HomeKit asks for different resolutions depending on the device) - you can try setting this manually (by editing the js file).
-vf "format=nv12,hwupload"
Is that right?
should be.
ok
You have missed off the fps value after -r.
give me a sec
Hello, please could you help with the camera node. When I tried to get ffmpeg to hardware-decode, I looked up the official hardware decoding arguments: `-vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi`. When I add them like this along with the monitored RTSP connection, it seems unable to start, and I am confused as to why this is not possible - the same command starts normally under Home Assistant. In addition, I found that ffmpeg seems to support Intel QSV, but the build I installed from `apk add ffmpeg` does not seem to have been compiled with it, and my attempt to recompile ffmpeg myself failed. My platform is node-red installed in Docker on a DS918+ (Xpenology) running DSM 6.2.2. When I run `ls /dev/dri` under node-red, Docker correctly recognizes renderD128.