Closed kevkid closed 1 year ago
A few things here:
(I have built the docker image from the source code as I could not find a :latest tag and wanted to try out the latest go2rtc enabling webrtc input)
There is no need to do this, Frigate has dev images already built and available on the repo
Hi @NickM-27, thank you for the response. You are correct: in 0.13, PCM audio works for both MSE and WebRTC, as you stated in point 1. But I am not so sure that taking PCM audio directly from the camera and adding it as an input to go2rtc will pass the audio through to the recording. Here is an example of what my connection looked like:
rtsp://127.0.0.1:8557/back-yard-cam
I also tried:
ffmpeg:rtsp://127.0.0.1:8557/back-yard-cam#video=copy#audio=copy
The live view would be fine, and even viewing the RTSP stream in VLC directly from the bridge gives audio (VLC confirms the audio track). Whereas if we pull the RTSP stream from the restream, it does not include an audio channel (according to VLC). I was perplexed by this, but I suspect it has to do with how the restream is packaged. It is not completely clear why it drops the audio channel (maybe because MP4 can't carry PCM? not sure). But I know for a fact that even though my output args looked like:
-f segment -segment_time 60 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c:v copy -c:a aac
There would be no audio track on the video.
Now my stream looks like:
```yaml
go2rtc:
  streams:
    # back_yard: webrtc:http://<bridge-ip>:5000/signaling/back-yard-cam?kvs#format=wyze  # <-- This is the correct format for webrtc input as desired by the original post
    back_yard: ffmpeg:rtsp://127.0.0.1:8557/back-yard-cam#video=copy#audio=aac
```
and my output args look like:
```yaml
output_args:
  record: -f segment -segment_time 60 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c:v copy -c:a copy
```
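Putting the two snippets above together, the working configuration is roughly the following sketch (the stream name and the bridge port 8557 are from my setup; the idea is to transcode the PCM audio to AAC once in go2rtc, so the recording can copy it):

```yaml
go2rtc:
  streams:
    # Transcode the bridge's PCM audio to AAC in go2rtc:
    back_yard: ffmpeg:rtsp://127.0.0.1:8557/back-yard-cam#video=copy#audio=aac

cameras:
  back_yard:
    ffmpeg:
      output_args:
        # The audio is already AAC, so the recording just copies both tracks:
        record: -f segment -segment_time 60 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c:v copy -c:a copy
```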
For 2, yes, this is correct.
would look something like: ffmpeg:rtsp://127.0.0.1:8557/back-yard-cam#video=copy#audio=opus
For 3, you are also correct. The answer I got to was:
```yaml
go2rtc:
  streams:
    # back_yard: webrtc:http://<bridge-ip>:5000/signaling/back-yard-cam?kvs
```
But interestingly, I could avoid all of this transcoding if I could output to MKV rather than MP4, since MP4 limits the audio codecs I can use (to AAC). I would like to just mux the stream into MKV directly and view it (even without audio on the web interface). Is there a way to do this?
This is because Frigate configures go2rtc to return only AAC audio by default. Simply add ?video&audio to the end of the RTSP restream URL and the audio will be included.
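For example, assuming go2rtc's default RTSP restream port 8554 and a stream named back_yard (both placeholders for your own setup), the adjusted URL would look like:

```yaml
# RTSP restream URL with all audio tracks included, not just AAC:
# rtsp://127.0.0.1:8554/back_yard?video&audio
```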
Is there a way to do this?
No, only MP4 is supported, for a number of reasons.
Thank you for your help. Closing.
Describe the problem you are having
What I am trying to achieve:
Get a low-latency, high-quality live view with audio -> save video with the same quality + audio.
Context:
I have several Wyze cameras (2 V3 and 2 Pan V3). I want to use WebRTC as a source to get the lowest latency. I am running docker-wyze-bridge to grab the streams from my cameras; it can output an RTSP stream or a WebRTC stream. Originally I used the RTSP firmware, but found out it was deprecated, so now I am using the wyze-bridge.
Current Setup:
I am currently getting the RTSP stream from the bridge. It works fine, albeit with some latency and some stuttering. Not the worst, but it can be improved. I have to encode the audio to AAC in the bridge, which introduces a little latency. To get a low-latency live view I need either MSE or WebRTC; jsmpeg is slow, capped at 10 fps, and has no audio. To get these to work I need to enable go2rtc and restream the input stream coming from the bridge (which is itself restreaming the input coming from the camera). I then use the restream output from go2rtc as the input to the camera. I do this for 4 cameras.
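The chain described above, sketched as a config (a sketch only: 8554 is go2rtc's default RTSP port, 8557 is my bridge's RTSP port, and the stream/camera names are placeholders):

```yaml
go2rtc:
  streams:
    # go2rtc pulls from the wyze-bridge restream (camera -> bridge -> go2rtc)...
    back_yard: rtsp://<wyze-bridge-ip>:8557/back-yard-cam

cameras:
  back_yard:
    ffmpeg:
      inputs:
        # ...and Frigate then consumes go2rtc's local restream as the camera input.
        - path: rtsp://127.0.0.1:8554/back_yard
          roles:
            - detect
            - record
```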
Thoughts:
I believe that having to restream/enable go2rtc is adding latency. While I do get a lower-latency, higher-quality stream with audio compared to jsmpeg, I think there is room for improvement. I believe using the WebRTC stream would improve the stability, latency, and quality of the feed, so I would like to use WebRTC as a source. Do we really have to restream to get access to the MSE/WebRTC view?
(I have built the docker image from the source code as I could not find a :latest tag and wanted to try out the latest go2rtc enabling webrtc input)
I tried setting the go2rtc stream to the following for back_side; you will see the output in the logs:
Version
0.13.0-7C629C1
Frigate config file
Relevant log output
Frigate stats
No response
Operating system
Other Linux
Install method
Docker Compose
Coral version
USB
Any other information that may be helpful
Using the bridge I can access WebRTC streams directly in the browser with much less latency. I can even get audio from it if I encode it with opus. I get a URL like this: http://<wyze-bridge-ip>:8889/backyard-side-camera

From my understanding, using the latest go2rtc, we can use WebRTC as a stream input, as shown by this PR: https://github.com/blakeblackshear/frigate/pull/7250 and this readme: https://github.com/AlexxIT/go2rtc#source-webrtc
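A sketch of what that would look like in the go2rtc config, reusing the webrtc source line quoted elsewhere in this thread (<bridge-ip> is a placeholder, and the ?kvs#format=wyze suffix is the format the bridge's signaling endpoint expects per that line):

```yaml
go2rtc:
  streams:
    # webrtc source, per https://github.com/AlexxIT/go2rtc#source-webrtc
    back_yard: webrtc:http://<bridge-ip>:5000/signaling/back-yard-cam?kvs#format=wyze
```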