my-umd closed this issue 1 year ago
I see a few things here that could be improved.
ffmpeg:
hwaccel_args: preset-vaapi
output_args:
record: preset-record-generic-audio-aac
You have a camera named frontdoor, but no camera in the go2rtc config named frontdoor. It should be:
go2rtc:
streams:
camera1:
- rtsp://xxx:xxx@192.168.222.333/live
- ffmpeg:camera1#audio=opus # <- needs to match the name of the camera for this config
webrtc:
candidates:
- 192.168.222.333:8555
- stun:8555
For audio, I tried both "aac" and "opus", but I cannot get audio from MSE, and I cannot get webrtc to work at all (spinning wheel forever).
Please provide an ffprobe output (it can be retrieved from the system page) with your camera's settings.
Also, I do not quite follow how I can use my go2rtc setting to replace my current camera setting (as present in the attached config).
Just follow the docs https://deploy-preview-4055--frigate-docs.netlify.app/configuration/restream#reduce-connections-to-camera
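Roughly, the approach in that doc is to point the camera's ffmpeg inputs at go2rtc's local RTSP restream so the camera itself only serves one connection. A minimal sketch, using this thread's placeholder camera name and address (the `preset-rtsp-restream` input args assume Frigate 0.12; verify against the linked docs):

```yaml
go2rtc:
  streams:
    camera1:
      - rtsp://xxx:xxx@192.168.222.333/live
      - ffmpeg:camera1#audio=aac

cameras:
  camera1:
    ffmpeg:
      inputs:
        # pull from go2rtc's restream on localhost instead of the camera directly
        - path: rtsp://127.0.0.1:8554/camera1
          input_args: preset-rtsp-restream
          roles:
            - record
            - detect
```

With this layout the go2rtc `streams` entry replaces the direct camera URL, and Frigate's detect/record pipelines consume the restream.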
It looks like go2rtc is used to reduce connections to cameras. But won't it likely increase CPU usage? How does it use hardware acceleration? If it simply copies the stream (which I guess is the CPU's job), it seems hardware acceleration is not that useful here?
go2rtc is used to provide the MSE / webrtc live views, reduce connections to the camera, and can be utilized for many other things as well. I have not seen CPU usage that high, but hardware acceleration is only used if you are transcoding your stream (like H.265 -> H.264).
I have Frigate integrated with Home Assistant (HA). If I understand correctly, it uses RTMP to send the stream over to HA. I get that RTMP is deprecated. If go2rtc weighs heavily on the CPU and brings down the host, does that mean there is no way to integrate with HA?
There were some reports of go2rtc performance issues in the early betas, but at this point I have not heard of any. For now RTMP can be used, but it is deprecated and will be removed in a future version; it also uses some CPU.
Thanks @NickM-27. Good catch on the camera name mismatch. Apparently that was left unchanged when I made the trimmed-down config. However, I still have no audio in MSE (the speaker icon is grayed out and cannot be toggled), and webrtc remains the same after matching up the camera name and using the ffmpeg presets as you suggested. Here is the ffprobe output:
[
{
"return_code": 0,
"stderr": "",
"stdout": {
"programs": [],
"streams": [
{
"avg_frame_rate": "15/1",
"codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
"height": 1080,
"width": 1920
},
{
"avg_frame_rate": "0/0",
"bit_rate": "64000",
"codec_long_name": "PCM A-law / G.711 A-law"
}
]
}
}
]
I'll read the docs again once I get go2rtc sorted out. And thanks for the info about go2rtc's CPU usage.
MSE / recordings won't have audio since those require AAC. Since your stream provides PCMA audio (which is compatible with webrtc), you should transcode to AAC in go2rtc:
go2rtc:
streams:
camera1:
- rtsp://xxx:xxx@192.168.222.333/live
- ffmpeg:camera1#audio=aac
webrtc:
candidates:
- 192.168.222.333:8555
- stun:8555
As far as webrtc issues go, you should check the browser logs and the go2rtc logs to see why it fails.
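Since MSE needs AAC while webrtc can use opus (or the camera's original PCMA), go2rtc can also expose more than one transcoded audio track by repeating the codec parameter on the ffmpeg source. A hedged sketch of that variant (syntax per the go2rtc README; verify against your go2rtc version):

```yaml
go2rtc:
  streams:
    camera1:
      - rtsp://xxx:xxx@192.168.222.333/live
      # adds an AAC track for MSE and an opus track for webrtc
      - ffmpeg:camera1#audio=aac#audio=opus
```

Note that go2rtc keeps the original PCMA track alongside the transcoded ones, so each player can pick the codec it supports.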
MSE / recordings won't have audio since those require AAC. Since your stream provides PCMA audio (which is compatible with webrtc), you should transcode to AAC in go2rtc.
Thanks. Changing it to 'aac' did give audio, but it is very broken up. Also, I see a spinning wheel almost constantly over the camera feed. It seems something is very slow and the live view keeps waiting for the feed. Could be due to my camera; I know my camera is kind of crappy and hard to work with :) Anyway, I'll keep playing.
As far as webrtc issues go, you should check the browser logs and the go2rtc logs to see why it fails.
Could you please let me know where/how to check these logs? The logs in the Frigate web UI are pretty sparse. For "browser logs", do you mean the developer tools? If so, I checked, and for some reason the initiator is still JSMpegPlayer-9cefae39.js when I switch to webrtc. I expected it to change to something like WebrtcPlayer?
Check the go2rtc logs in the Frigate web UI; otherwise you can check chrome://media-internals for WebRTC player debugging.
Thanks @NickM-27. The Frigate go2rtc log only has these:
2023-03-30 23:56:22.032818903 [INFO] Preparing go2rtc config...
2023-03-30 23:56:23.208118908 [INFO] Starting go2rtc...
2023-03-30 23:56:23.422099673 19:56:23.421 INF go2rtc version 1.2.0 linux/amd64
2023-03-30 23:56:23.422443182 19:56:23.422 INF [api] listen addr=:1984
2023-03-30 23:56:23.424745406 19:56:23.424 INF [rtsp] listen addr=:8554
2023-03-30 23:56:23.425125537 19:56:23.425 INF [srtp] listen addr=:8443
2023-03-30 23:56:23.425398654 19:56:23.425 INF [webrtc] listen addr=:8555
2023-03-30 23:56:32.063455316 [INFO] Starting go2rtc healthcheck service...
Managed to get these from chrome://media-internals. But, to be honest, I have no experience with this and didn't see anything too obvious :) Player properties:
render_id: 30
player_id: 0
created: 2023-03-30 23:42:50.343 UTC
origin_url: http://192.168.xxx.xxx:xxxxx/
kFrameUrl: http://192.168.xxx.xxx:xxxxx/cameras/camera1
kFrameTitle: Frigate
url: blob:http://192.168.xxx.xxx:xxxxx/a35b567d-3152-42f3-91aa-a5b92c622d0f
kTextTracks:
info: Effective playback rate changed from 3 to 4
kRendererName: RendererImpl
pipeline_state: kStopped
kVideoTracks: [object Object]
kAudioTracks: [object Object]
duration: unknown
debug: (Log limit reached. Further similar entries may be suppressed): ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite.
kIsAudioDecryptingDemuxerStream: false
kAudioDecoderName: FFmpegAudioDecoder
kIsPlatformAudioDecoder: false
kIsVideoDecryptingDemuxerStream: false
kVideoDecoderName: VDAVideoDecoder
kIsPlatformVideoDecoder: true
dimensions: 1920x1080
kResolution: 1920x1080
event: kWebMediaPlayerDestroyed
pipeline_buffering_state: [object Object]
warning: Failed to reconcile encoded audio times with decoded output.
audio_buffering_state: [object Object]
Log:
{
"properties": {
"render_id": 30,
"player_id": 0,
"created": "2023-03-30 23:42:50.343 UTC",
"origin_url": "http://192.168.xxx.xxx:xxxxx/",
"kFrameUrl": "http://192.168.xxx.xxx:xxxxx/cameras/camera1",
"kFrameTitle": "Frigate",
"url": "blob:http://192.168.xxx.xxx:xxxxx/a35b567d-3152-42f3-91aa-a5b92c622d0f",
"kTextTracks": [],
"info": "Effective playback rate changed from 3 to 4",
"kRendererName": "RendererImpl",
"pipeline_state": "kStopped",
"kVideoTracks": [
{
"alpha mode": "is_opaque",
"codec": "h264",
"coded size": "1920x1080",
"color space": {
"matrix": "BT709",
"primaries": "BT709",
"range": "LIMITED",
"transfer": "BT709"
},
"encryption scheme": "Unencrypted",
"has extra data": false,
"hdr metadata": "unset",
"natural size": "1920x1080",
"orientation": "0°",
"profile": "h264 main",
"visible rect": "0,0 1920x1080"
}
],
"kAudioTracks": [
{
"bytes per channel": 2,
"bytes per frame": 2,
"channel layout": "MONO",
"channels": 1,
"codec": "aac",
"codec delay": 0,
"discard decoder delay": false,
"encryption scheme": "Unencrypted",
"has extra data": false,
"profile": "unknown",
"sample format": "Signed 16-bit",
"samples per second": 8000,
"seek preroll": "0us"
}
],
"duration": "unknown",
"debug": "(Log limit reached. Further similar entries may be suppressed): ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite.",
"kIsAudioDecryptingDemuxerStream": false,
"kAudioDecoderName": "FFmpegAudioDecoder",
"kIsPlatformAudioDecoder": false,
"kIsVideoDecryptingDemuxerStream": false,
"kVideoDecoderName": "VDAVideoDecoder",
"kIsPlatformVideoDecoder": true,
"dimensions": "1920x1080",
"kResolution": "1920x1080",
"event": "kWebMediaPlayerDestroyed",
"pipeline_buffering_state": {
"for_suspended_start": false,
"reason": "DEMUXER_UNDERFLOW",
"state": "BUFFERING_HAVE_NOTHING"
},
"warning": "Failed to reconcile encoded audio times with decoded output.",
"audio_buffering_state": {
"reason": "DEMUXER_UNDERFLOW",
"state": "BUFFERING_HAVE_NOTHING"
}
},
"events": [
{
"time": 0,
"key": "created",
"value": "2023-03-30 23:42:50.343 UTC"
},
{
"time": 2.2720000743865967,
"key": "origin_url",
"value": "http://192.168.xxx.xxx:xxxxx/"
},
{
"time": 2.3380000591278076,
"key": "kFrameUrl",
"value": "http://192.168.xxx.xxx:xxxxx/cameras/camera1"
},
{
"time": 2.3510000705718994,
"key": "kFrameTitle",
"value": "Frigate"
},
{
"time": 3.263000011444092,
"key": "url",
"value": "blob:http://192.168.xxx.xxx:xxxxx/a35b567d-3152-42f3-91aa-a5b92c622d0f"
},
{
"time": 3.36899995803833,
"key": "kTextTracks",
"value": []
},
{
"time": 3.703000068664551,
"key": "info",
"value": "ChunkDemuxer"
},
{
"time": 3.81000018119812,
"key": "kRendererName",
"value": "RendererImpl"
},
{
"time": 171.41200017929077,
"key": "pipeline_state",
"value": "kStarting"
},
{
"time": 939.5030000209808,
"key": "kVideoTracks",
"value": [
{
"alpha mode": "is_opaque",
"codec": "h264",
"coded size": "1920x1080",
"color space": {
"matrix": "BT709",
"primaries": "BT709",
"range": "LIMITED",
"transfer": "BT709"
},
"encryption scheme": "Unencrypted",
"has extra data": false,
"hdr metadata": "unset",
"natural size": "1920x1080",
"orientation": "0°",
"profile": "h264 main",
"visible rect": "0,0 1920x1080"
}
]
},
{
"time": 939.5859999656677,
"key": "kAudioTracks",
"value": [
{
"bytes per channel": 2,
"bytes per frame": 2,
"channel layout": "MONO",
"channels": 1,
"codec": "aac",
"codec delay": 0,
"discard decoder delay": false,
"encryption scheme": "Unencrypted",
"has extra data": false,
"profile": "unknown",
"sample format": "Signed 16-bit",
"samples per second": 8000,
"seek preroll": "0us"
}
]
},
{
"time": 939.8420000076294,
"key": "duration",
"value": "unknown"
},
{
"time": 1376.5210001468658,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1448.0600001811981,
"key": "kIsAudioDecryptingDemuxerStream",
"value": false
},
{
"time": 1448.0670001506805,
"key": "kAudioDecoderName",
"value": "FFmpegAudioDecoder"
},
{
"time": 1448.0700001716614,
"key": "kIsPlatformAudioDecoder",
"value": false
},
{
"time": 1448.1050000190735,
"key": "info",
"value": "Selected FFmpegAudioDecoder for audio decoding, config: codec: aac, profile: unknown, bytes_per_channel: 2, channel_layout: MONO, channels: 1, samples_per_second: 8000, sample_format: Signed 16-bit, bytes_per_frame: 2, seek_preroll: 0us, codec_delay: 0, has extra data: false, encryption scheme: Unencrypted, discard decoder delay: false, target_output_channel_layout: STEREO, target_output_sample_format: Unknown sample format, has aac extra data: true"
},
{
"time": 1448.2160000801086,
"key": "debug",
"value": "Video rendering in low delay mode."
},
{
"time": 1448.388000011444,
"key": "info",
"value": "Cannot select DecryptingVideoDecoder for video decoding"
},
{
"time": 1449.922000169754,
"key": "info",
"value": "Starting Initialization of DXVAVDA"
},
{
"time": 1469.2060000896454,
"key": "info",
"value": "Using D3D11 device for DXVA"
},
{
"time": 1471.989000082016,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1472.047000169754,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1480.78400015831,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1480.8519999980927,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1481.292000055313,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1481.3229999542236,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1482.0700001716614,
"key": "kIsVideoDecryptingDemuxerStream",
"value": false
},
{
"time": 1482.0759999752045,
"key": "kVideoDecoderName",
"value": "VDAVideoDecoder"
},
{
"time": 1482.0950000286102,
"key": "kIsPlatformVideoDecoder",
"value": true
},
{
"time": 1482.1490001678467,
"key": "info",
"value": "Selected VDAVideoDecoder for video decoding, config: codec: h264, profile: h264 main, level: not available, alpha_mode: is_opaque, coded size: [1920,1080], visible rect: [0,0,1920,1080], natural size: [1920,1080], has extra data: false, encryption scheme: Unencrypted, rotation: 0°, flipped: 0, color space: {primaries:BT709, transfer:BT709, matrix:BT709, range:LIMITED}"
},
{
"time": 1482.223000049591,
"key": "pipeline_state",
"value": "kPlaying"
},
{
"time": 1619.9720001220703,
"key": "dimensions",
"value": "1920x1080"
},
{
"time": 1620.005000114441,
"key": "kResolution",
"value": "1920x1080"
},
{
"time": 1671.6419999599457,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 1, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1674.630000114441,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1674.7090001106262,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1694.1760001182556,
"key": "info",
"value": "Effective playback rate changed from 0 to 1"
},
{
"time": 1695.026999950409,
"key": "event",
"value": "kPlay"
},
{
"time": 1700.3550000190735,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1700.4360001087189,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1701.542000055313,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1702.1059999465942,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1887.4120001792908,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1887.4690001010895,
"key": "debug",
"value": "Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1890.1040000915527,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1890.1730000972748,
"key": "debug",
"value": "(Log limit reached. Further similar entries may be suppressed): Media segment did not contain any coded frames for track 2, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media."
},
{
"time": 1895.1470000743866,
"key": "debug",
"value": "ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 1675.8700001239777,
"key": "pipeline_buffering_state",
"value": {
"for_suspended_start": false,
"state": "BUFFERING_HAVE_ENOUGH"
}
},
{
"time": 2091.1150000095367,
"key": "debug",
"value": "(Log limit reached. Further similar entries may be suppressed): ISO-BMFF container metadata for video frame indicates that the frame is a keyframe, but the video frame contents indicate the opposite."
},
{
"time": 2118.923000097275,
"key": "warning",
"value": "Failed to reconcile encoded audio times with decoded output."
},
{
"time": 2567.170000076294,
"key": "pipeline_buffering_state",
"value": {
"for_suspended_start": false,
"state": "BUFFERING_HAVE_ENOUGH"
}
},
{
"time": 4867.321000099182,
"key": "audio_buffering_state",
"value": {
"reason": "DEMUXER_UNDERFLOW",
"state": "BUFFERING_HAVE_NOTHING"
}
},
{
"time": 4867.715000152588,
"key": "pipeline_buffering_state",
"value": {
"for_suspended_start": false,
"reason": "DEMUXER_UNDERFLOW",
"state": "BUFFERING_HAVE_NOTHING"
}
},
{
"time": 6001.366000175476,
"key": "pipeline_buffering_state",
"value": {
"for_suspended_start": false,
"state": "BUFFERING_HAVE_ENOUGH"
}
},
{
"time": 9441.480000019073,
"key": "audio_buffering_state",
"value": {
"reason": "DEMUXER_UNDERFLOW",
"state": "BUFFERING_HAVE_NOTHING"
}
},
{
"time": 10122.315999984741,
"key": "info",
"value": "Effective playback rate changed from 1 to 2"
},
{
"time": 9441.773000001907,
"key": "pipeline_buffering_state",
"value": {
"for_suspended_start": false,
"reason": "DEMUXER_UNDERFLOW",
"state": "BUFFERING_HAVE_NOTHING"
}
},
{
"time": 11159.294000148773,
"key": "info",
"value": "Effective playback rate changed from 2 to 3"
},
{
"time": 12358.338999986649,
"key": "info",
"value": "Effective playback rate changed from 3 to 2"
},
{
"time": 11544.71900010109,
"key": "pipeline_buffering_state",
"value": {
"for_suspended_start": false,
"state": "BUFFERING_HAVE_ENOUGH"
}
},
{
"time": 13212.8140001297,
"key": "pipeline_buffering_state",
"value": {
"for_suspended_start": false,
"reason": "DEMUXER_UNDERFLOW",
"state": "BUFFERING_HAVE_NOTHING"
}
},
{
"time": 14147.154000043869,
"key": "info",
"value": "Effective playback rate changed from 2 to 3"
},
{
"time": 15180.278000116348,
"key": "info",
"value": "Effective playback rate changed from 3 to 4"
},
{
"time": 15468.411000013351,
"key": "event",
"value": "kPause"
},
{
"time": 25905.021000146866,
"key": "pipeline_state",
"value": "kStopping"
},
{
"time": 25907.84599995613,
"key": "event",
"value": "kWebMediaPlayerDestroyed"
},
{
"time": 25912.25600004196,
"key": "pipeline_state",
"value": "kStopped"
}
]
}
BTW, I am closing this because I realized it might take a lot of time and might be specific to my setup. I still have the documentation to go over, and I don't want to take too much of the devs' time.
I'm seeing some "Failed to reconcile encoded audio times with decoded output." entries, which implies audio issues from the camera (like you mentioned) causing problems for webrtc as well. From what I can tell, this seems to be a camera issue.
Thanks @NickM-27. I'll keep playing and report if I get it to work. But if it's the camera, I see no hope :)
Finally got webrtc to work. It turns out that I need to use the default webrtc ports (i.e., 8555 tcp and udp). When I tried earlier, I used a non-standard port 4xxxx. Is there a way to use a non-standard port for webrtc? E.g., when creating my docker container, I use port 12345 to forward to 8555 (tcp and udp). Do I need to set up port forwarding in my router? The docs say I only need to forward a port if accessing from outside. Maybe that is for users who use the standard 8555 port? If using a non-standard port, will I have to do port forwarding no matter what?
BTW, here is my command to create the docker container:
docker run \
  --name=frigate \
  --shm-size=256m \
  --restart=unless-stopped \
  --env=TZ=America/New_York \
  --env=LIBVA_DRIVER_NAME=i965 \
  --env=FRIGATE_RTSP_PASSWORD=xxxxxx \
  --volume=/share/Container/frigate/config:/config:rw \
  --volume=/share/share_vol2/frigate/media:/media/frigate:rw \
  --network=bridge \
  --privileged \
  --workdir=/opt/frigate \
  -p 49350:1935 \
  -p 49230:5000 \
  -p 49231:8554 \
  -p 49232:8555 \
  -p 49232:8555/udp \
  --label='com.qnap.qcs.network.mode=nat' \
  --label='com.qnap.qcs.gpu=False' \
  --memory="4g" \
  --cpus="2" \
  --detach=true \
  -t ghcr.io/blakeblackshear/frigate:0.12.0
I am using NAT network mode. If I do want to use non-standard webrtc port (49232 in my command), do I need to set up port forwarding on my router to forward traffic from 49232 to 8555? I am somewhat confused, because I think the ports are managed by the docker host.
Regarding the slowness of the MSE feed: I think it is due to the camera. My test was using a Wyze v2 camera with rtsp firmware. I have some Wyze v3 rtsp cameras as well, and I do not see the same behavior with them. For some reason, MSE does not like rtsp Wyze v2 cameras (reproducible on my second Wyze v2).
If you're going to use a non-standard port, the go2rtc config and your docker config need to reflect that new port.
If you're going to use a non-standard port, the go2rtc config and your docker config need to reflect that new port.
Thanks Nick. Could you please elaborate a bit? My docker run command already has the port mapping. What should I do in the config file? I see webrtc can specify a port; I will see if that works.
FYI: changing the port from 8555 to match the port in my sample docker run command (i.e., 49232) in the config file's webrtc candidates brings webrtc to life. Also, Wyze v2 is very smooth under webrtc, while it's sluggish under MSE. Wyze v3 is smooth under both MSE and webrtc. I do not understand why I don't need to make changes for other non-standard ports, e.g., port 8554. Maybe because they are all internal (to docker)?
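For reference, what ended up working in this thread can be sketched as follows: keep go2rtc listening on 8555 inside the container, map the non-standard host port to it for both tcp and udp (as in the docker run command above, `-p 49232:8555 -p 49232:8555/udp`), and advertise the host-side port in the webrtc candidates. The IPs and ports here are this thread's placeholders:

```yaml
go2rtc:
  webrtc:
    candidates:
      # host IP with the host-side (non-standard) port that docker maps to 8555
      - 192.168.222.333:49232
      - stun:8555
```

The candidates list is what the browser uses to reach go2rtc, which is presumably why only the externally visible webrtc port needs changing, while ports like 8554 are only used between Frigate and go2rtc inside the container.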
Describe the problem you are having
I have read the documentation but still cannot fully get it to work. I trimmed down my config to include only one camera; please check the attached config.yml file. For audio, I tried both "aac" and "opus", but I cannot get audio from MSE, and I cannot get webrtc to work at all (spinning wheel forever). Also, I do not quite follow how I can use my go2rtc setting to replace my current camera setting (as present in the attached config).
It looks like go2rtc is used to reduce connections to cameras. But won't it likely increase CPU usage? How does it use hardware acceleration? If it simply copies the stream (which I guess is the CPU's job), it seems hardware acceleration is not that useful here?
I have Frigate integrated with Home Assistant (HA). If I understand correctly, it uses RTMP to send the stream over to HA. I get that RTMP is deprecated. If go2rtc weighs heavily on the CPU and brings down the host, does that mean there is no way to integrate with HA?
Thanks for your help.
Version
0.12.0-7D589BD
Frigate config file
Relevant log output
Frigate stats
Operating system
Other Linux
Install method
Docker CLI
Coral version
Other
Any other information that may be helpful
Wyze cam v2 with rtsp firmware.