Closed hawkeye217 closed 1 year ago
It's definitely a known issue that startup is slower when only the logo is showing; this is the case for h264 as well. I'd suggest looking in the go2rtc config to see what the ffmpeg command is and then using that directly for your mjpeg camera so it is not dependent on two ffmpeg processes spinning up.
We are already planning on a birdseye revamp at some point in the future and this will be improved at that point.
This is the command I can run inside the container to watch the fps changes in realtime:
ffmpeg -hide_banner -hwaccel vaapi -hwaccel_output_format vaapi -hwaccel_device /dev/dri/renderD129 -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -f mp4 foo.mp4
The go2rtc config runs the birdseye command from frigate:
birdseye: exec:ffmpeg -hide_banner -hwaccel vaapi -hwaccel_output_format vaapi -hwaccel_device /dev/dri/renderD129 -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v h264_vaapi -g 50 -bf 0 -profile:v high -level:v 4.1 -sei:v 0 -an -vf format=vaapi|nv12,hwupload -rtsp_transport tcp -f rtsp {output}
birdseyemjpeg:
- ffmpeg:birdseye#video=mjpeg
Go2rtc reads from the birdseye pipe, and that's the issue - the pipe is only dripping 1fps when the logo is displayed, then 10fps when motion frames are written to the pipe.
So where would you suggest I run ffmpeg to generate the mjpeg stream? I might be mistaken, but go2rtc requires rtsp output from ffmpeg, which is why I just set up another go2rtc stream for mjpeg.
I'm suggesting that instead of using ffmpeg:birdseye (which means the birdseye stream has to start before your mjpeg stream can), you create a manual ffmpeg command that reads directly from the birdseye pipe and outputs mjpeg, skipping the step of the h264 stream needing to start first (and reducing load overall).
Right. I don't know how to make ffmpeg output mjpeg within rtsp - though I know it's possible because some Dahua cameras do mjpeg via rtsp as a substream.
Off to google for a bit... 😂
It should be the same, just -c:v mjpeg_vaapi instead. Although vaapi may not support mjpeg output, in which case you'd need to use software encoding. You can also enable trace logs for exec in go2rtc and see the ffmpeg command in the logs.
I'm definitely missing something.
go2rtc:
log:
level: debug
exec: trace
streams:
birdseyemjpeg:
- exec:ffmpeg -hide_banner -hwaccel vaapi -hwaccel_output_format vaapi -hwaccel_device /dev/dri/renderD129 -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg_vaapi -g 50 -bf 0 -profile:v high -level:v 4.1 -sei:v 0 -an -vf format=vaapi|nv12,hwupload -rtsp_transport tcp -f rtsp {output}
I get this:
Input #0, rawvideo, from '/tmp/cache/birdseye':
Duration: N/A, bitrate: 110592 kb/s
Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 1280x720, 110592 kb/s, 10 tbr, 10 tbn
At least one output file must be specified
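(As a side note, that 110592 kb/s figure is just the raw yuv420p bandwidth of the pipe; a quick back-of-the-envelope check of my own, not something from the logs:)

```python
# Sanity-checking the bitrate ffmpeg reports for the raw birdseye pipe.
# The dimensions and fps come from the stream info above; the arithmetic is mine.

width, height, fps = 1280, 720, 10

# yuv420p stores 1 byte of luma per pixel plus quarter-resolution
# U and V planes, i.e. 1.5 bytes per pixel.
bytes_per_frame = width * height * 3 // 2

bits_per_second = bytes_per_frame * 8 * fps
print(bits_per_second // 1000, "kb/s")  # 110592 kb/s, matching ffmpeg's output
```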
Removing hwaccel and using this causes go2rtc to not even come up, nothing in the go2rtc log either (even with log level: debug):
streams:
birdseyemjpeg:
- "exec:ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg -r 10 -rtsp_transport tcp -f rtsp {output}"
Looks like Frigate is doing something strange with the {output} parameter when it generates the /dev/shm/go2rtc.yaml file.
If I remove it from the frigate config file, go2rtc starts normally (but obviously doesn't start the stream that uses it). I'm guessing this is probably a bug...
The docs already cover this: you need to put it as {{output}} so it doesn't get caught by the template formatter.
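In Frigate's config that looks something like this (a sketch of the idea; the doubled braces survive Frigate's templating and come out as a literal {output} in the generated /dev/shm/go2rtc.yaml):

```yaml
go2rtc:
  streams:
    birdseyemjpeg:
      # {{output}} is escaped so Frigate's template formatter
      # passes a literal {output} through to go2rtc
      - "exec:ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg -an -rtsp_transport tcp -f rtsp {{output}}"
```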
Ahhh. Sorry for not RTFMing 🤦🏻‍♂️ Thanks.
Moved away from hwaccel since this Intel machine doesn't do mjpeg via vaapi. This seems to start working:
exec:ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg -an -rtsp_transport tcp -f rtsp {{output}}
But the slow drip of the birdseye pipe is causing ffmpeg to time out. Back at square one.
2023-04-14 11:56:44.215423422 frame= 13 fps=1.1 q=8.2 size=N/A time=00:00:01.20 bitrate=N/A speed=0.104x
[rtp @ 0x55ee0a780740] RFC 2435 requires standard Huffman tables for jpeg
2023-04-14 11:56:45.242757695 frame= 14 fps=1.1 q=9.6 size=N/A time=00:00:01.30 bitrate=N/A speed=0.104x
[rtp @ 0x55ee0a780740] RFC 2435 requires standard Huffman tables for jpeg
2023-04-14 11:56:45.655404161 frame= 15 fps=1.1 q=11.2 size=N/A time=00:00:01.40 bitrate=N/A speed=0.103x
11:56:45.655 WRN github.com/AlexxIT/go2rtc/cmd/streams/producer.go:133 > error="read tcp 127.0.0.1:8554->127.0.0.1:47992: i/o timeout" url="exec:ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg -an -rtsp_transport tcp -f rtsp {output}"
2023-04-14 11:56:45.655430393 11:56:45.655 DBG [exec] run url="exec:ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg -an -rtsp_transport tcp -f rtsp rtsp://localhost:8554/1e3e5cb8a22e528f79a6797612d03923"
I wouldn't expect that to behave any differently than the other ffmpeg process, and I don't see any timeout args, which is odd. I'm sure there is a way to set that on the input though.
Actually, partial progress - that Huffman tables line led me to add "-huffman 0" to the args and that seems to have solved the timeout.
Now, the pixel format is probably my issue, as the mjpeg stream's colors and size/layout are distorted.
Well, that's about as far as I'm able to get. It's an mjpeg stream, but I'm way out of my league with all of the colorspace, codec, and compression stuff 😂
This is my command in go2rtc:
birdseyemjpeg:
- exec:ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg -huffman 0 -an -rtsp_transport tcp -f rtsp {output}
Maybe @AlexxIT has ideas? Might try to open an issue with go2rtc.
Found the solution in Alex's go2rtc code. The working command is:
streams:
birdseyemjpeg:
- exec:ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v mjpeg -force_duplicated_matrix:v 1 -huffman:v 0 -pix_fmt:v yuvj420p -an -rtsp_transport tcp -f rtsp {{output}}
This gives me a super fast loading MJPEG stream that works perfectly.
@hawkeye217, trying to use your entry above to stream to my Roku TV.
I have the stream working perfectly when browsing to it via Chrome, but when I enter my info in the IP Camera Viewer - Basic app, I just get a black screen. I'm wondering if I'm missing something.
I've configured the camera in the app as follows:
Name: Birdseye
Description:
IP: 192.168.1.14
TCP Port: 1984
Login:
Password:
Stream URL: /api/stream.mjpeg?src=birdseyemjpeg
You see anything obvious?
So the stream in your browser works with http://192.168.1.14:1984/api/stream.mjpeg?src=birdseyemjpeg ?
If I recall correctly, I used the frigate port (5000) and prepended the URL with /live/webrtc. But I haven't tested it on the TV recently.
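If it helps, the two candidate URLs would look something like this (the Frigate-proxied path is my reconstruction from the comment above, not verified):

```
# go2rtc API directly:
http://192.168.1.14:1984/api/stream.mjpeg?src=birdseyemjpeg

# via Frigate on port 5000 (path reconstructed from memory, untested):
http://192.168.1.14:5000/live/webrtc/api/stream.mjpeg?src=birdseyemjpeg
```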
Describe the problem you are having
I have a Roku TV that I want to stream birdseye to, using the motion mode. Roku has a simple app available to display mjpeg streams from IP cameras (called IP Camera Viewer - Basic). I'm setting up an automation in HA that will essentially display the restreamed birdseye view on the TV from an Alexa command. I've set that up without a problem. I'm using an additional line in my go2rtc config to convert the birdseye stream to mjpeg. Once go2rtc brings up the stream (sometimes that takes a bit), it all works. I can use the mjpeg link from go2rtc and put that in the Roku app, and I can see the birdseye stream on the TV.
The problem comes when the stream in go2rtc is inactive for a while. It takes some time for go2rtc to begin generating the rtsp stream, and then takes more time for the mjpeg stream to spin up. This causes the Roku app to timeout with an error.
I've noticed a faster spinup in go2rtc when birdseye is already streaming something (whether it be from motion or if it's set on continuous). Obviously when it's streaming something more than the birdseye logo, the framerate coming from the birdseye pipe jumps to 10fps (up from 1fps when there's no motion), so I wonder if this is contributing to my issue.
Maybe this could be fixed with some sort of ffmpeg arg or something in go2rtc to generate a more consistent framerate and thus avoid any delays/buffering in the spinup of the mjpeg stream for the Roku.
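One untested idea along those lines (my own sketch, not from the docs): -use_wallclock_as_timestamps stamps frames as they actually arrive from the pipe, and ffmpeg's fps filter then duplicates frames to fill the gaps up to a steady 10fps, so something like this might keep the encoder fed even while the pipe only drips the logo at 1fps:

```yaml
go2rtc:
  streams:
    birdseyemjpeg:
      # wallclock timestamps + the fps filter duplicate slow frames
      # so the output holds 10fps even when the pipe writes 1fps
      - "exec:ffmpeg -f rawvideo -use_wallclock_as_timestamps 1 -pix_fmt yuv420p -video_size 1280x720 -i /tmp/cache/birdseye -vf fps=10 -c:v mjpeg -huffman 0 -an -rtsp_transport tcp -f rtsp {{output}}"
```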
I realize my use case is unique, and with go2rtc now in Frigate, I might be able to just use that to craft a better solution rather than making changes to birdseye code.
Let me know if you have any thoughts or ideas!
Version
0.12.0-7d589bd
Frigate config file
Relevant log output
FFprobe output from your camera
Frigate stats
No response
Operating system
Other Linux
Install method
Docker Compose
Coral version
USB
Network connection
Wired
Camera make and model
Dahua
Any other information that may be helpful
No response