blakeblackshear opened 3 years ago
Yes. It is also way more bandwidth friendly.
I was poking around with ffmpeg and HLS yesterday, and it was simple to get a proof-of-concept going in Safari (which natively supports HLS). It'll need some JS for other browsers (hls.js maybe?), or possibly for all browsers, depending on playback controls.
Is this something that's being looked at, or could I help by trying some things out? If so, has any thought already been given to this? I'm not sure I have the time to get a fully fledged PR going, though I don't mind trying.
Not sure where we are looking to use this, but the first questions everyone will have are:
"Why is there a delay?" A delay based on key frame rate will always exist for HLS by design. Nothing we can do here. "Can we get the motion/regions/bounding boxes on the HLS feed?" Nope. Never.
For Chromecast support, it should be as simple as adding some nginx config:
```nginx
rtmp {
    server {
        listen 1935;  # Listen on standard RTMP port
        chunk_size 4000;

        application show {
            live on;

            # Turn on HLS
            hls on;
            hls_path /mnt/hls/;
            hls_fragment 3;
            hls_playlist_length 60;

            # Disable consuming the stream from nginx as RTMP
            deny play all;
        }
    }
}
```
The downside is that it will write to disk constantly, but we can probably point it at /tmp.
https://docs.peer5.com/guides/setting-up-hls-live-streaming-server-using-nginx/
The peer5 link doesn't mention whether the nginx module supports fragmented MP4 - apparently that's a feature added to HLS to make it more compatible with MPEG-DASH, but also to support other codecs (HEVC, for instance). That'd be nice to have.
I didn't want to duplicate somebody's work, but if nobody's actively looking at it, I'd like to see if the fMP4 output from `ffmpeg -f hls` is compatible (or can be made compatible) with the clips feature. In that case, a basic HLS stream could be added with no additional overhead, as a first step.
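For reference, a minimal sketch of the kind of invocation I mean (the camera URL and output path are placeholders, and it assumes the camera's codec can be stream-copied into fMP4):

```sh
# Stream-copy an RTSP feed into a short, self-pruning fMP4 HLS playlist
ffmpeg -rtsp_transport tcp -i rtsp://camera/stream \
  -c copy \
  -f hls \
  -hls_segment_type fmp4 \
  -hls_time 2 \
  -hls_list_size 5 \
  -hls_flags delete_segments \
  /tmp/hls/stream.m3u8
```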
Re annotating the video, I suppose it might be possible to re-draw the boxes via JS on a canvas, although getting the timing right would probably be an interesting problem :-) This also requires consuming the HLS stream via the app rather than directly. We could also use ffmpeg's overlay feature and show a re-encoded stream, but that'd definitely require hardware acceleration.
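To illustrate the canvas idea, a rough browser-side sketch (the websocket endpoint and the `{ts, boxes: [{x, y, w, h}]}` message shape are hypothetical, and mapping detection timestamps onto the delayed HLS playhead is exactly the timing problem mentioned above):

```js
const video = document.querySelector('#cam');      // the HLS <video> element
const canvas = document.querySelector('#overlay'); // positioned over the video
const ctx = canvas.getContext('2d');

const pending = []; // boxes waiting for the delayed playhead to catch up
let current = null; // most recent boxes whose timestamp has been reached

// Hypothetical endpoint pushing {ts, boxes: [{x, y, w, h}]} messages
const ws = new WebSocket('ws://frigate.local/ws/detections');
ws.onmessage = (e) => pending.push(JSON.parse(e.data));

function draw() {
  // Advance to the latest detection at or before the playhead
  while (pending.length && pending[0].ts <= video.currentTime) {
    current = pending.shift();
  }
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  if (current) {
    ctx.strokeStyle = 'red';
    for (const b of current.boxes) ctx.strokeRect(b.x, b.y, b.w, b.h);
  }
  requestAnimationFrame(draw);
}
requestAnimationFrame(draw);
```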
There are other hairy bits here as well, like multiple bitrates (for cameras with multiple streams, those could be copied into the HLS output too - I haven't tried it yet, but the ffmpeg docs show how).
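For the multi-bitrate case, the ffmpeg HLS muxer's `-var_stream_map` option can copy a camera's sub-streams into variants under one master playlist. A sketch, untested as noted above (input URLs and paths are placeholders):

```sh
# Copy two camera sub-streams into HLS variants under one master playlist
ffmpeg -i rtsp://camera/high -i rtsp://camera/low \
  -map 0:v -map 1:v -c copy \
  -f hls -hls_time 2 -hls_list_size 5 \
  -var_stream_map "v:0 v:1" \
  -master_pl_name master.m3u8 \
  /tmp/hls/cam_%v.m3u8
```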
I have an example nginx config for serving the HLS files generated with ffmpeg to Chromecasts. I wouldn't bother with the fragmented stuff unless the Chromecast supports it. HLS can be a more data-efficient way to view feeds, with a delay, while keeping the option of an MJPEG live view at the expense of more data usage. I am working on jsmpeg as a replacement for MJPEG.
I'm not personally interested in Chromecast, and I'm not worried about a few seconds of delay - I was more interested in whether Frigate could be a complete NVR replacement: able to show 6-9 cameras at a time and rewind through the timeline :-) Basically, to replace my Synology Surveillance Station (which already adds 4-5 seconds of delay when viewing through it). Maybe also make sure it could help expose cameras as HomeKit cameras, with motion/doorbell through MQTT. HLS seemed like a necessary first step for that :-)
HLS won't help much with showing multiple cameras. You likely want what I am working on for that.
I think Home Assistant can expose them as HomeKit cameras for you.
For rewinding through a timeline, see this issue.
True re 24/7 recording vs. trusting that anything interesting gets recorded. It also means a lot less storage is needed, which I suppose makes HEVC's storage savings less relevant.
I thought something like https://github.com/video-dev/hls.js/ would enable showing multiple cameras, but there'd still be the usual latency from this protocol of course. Definitely looking forward to seeing your jsmpeg work.
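Roughly, a multi-camera page with hls.js might look like this (stream URLs and element IDs are placeholders; assumes hls.js is loaded, e.g. via its CDN bundle exposing the global `Hls`):

```js
// One Hls instance per <video>, with native HLS fallback for Safari
const feeds = ['/hls/front.m3u8', '/hls/back.m3u8', '/hls/drive.m3u8'];

feeds.forEach((src, i) => {
  const video = document.querySelector(`#cam-${i}`);
  if (Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = src; // Safari plays HLS natively
  }
});
```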
@blakeblackshear Have you looked at Shinobi CCTV?
I have run this for over a year and helped build the Coral plugin. I am running up to 7 live cameras + Coral image detection on its web interface with no performance issues (without using HLS). Perhaps you can gain insights from this project. Source Code
I agree with @Mitchross; there are certainly things to learn from Shinobi.
The HLS stream there does point to /tmp for the same disk I/O reasons. Detection bounding boxes are actually overlaid via websocket and not written into the HLS stream. This won't help certain client types, and the overlay must also be synced with the key frame delays that HLS is bound by (an issue for me in Shinobi).
I am interested in this for the purpose of showing a public camera feed. Definitely fine if it's a few seconds behind, and probably a good thing it doesn't have detection boxes.
I used to use Shinobi for this (it had a whole API-key system for external access to feeds), but I found it to be much more of a burden to set up and configure.
HLS is fully supported in go2rtc; I think all we would need to do here is potentially add an nginx entry in the frigate nginx config to make it part of frigate's API: https://github.com/AlexxIT/go2rtc#module-hls
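Something like this in frigate's nginx config might be all it takes (go2rtc's API listens on port 1984 by default; the `/api/go2rtc/` path here is just an assumption for illustration):

```nginx
# Expose go2rtc's HLS endpoint under frigate's own API, e.g.
# /api/go2rtc/stream.m3u8?src=front_door
location /api/go2rtc/ {
    proxy_pass http://127.0.0.1:1984/api/;
}
```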
Just to mention, Chromecast natively supports playing through WebRTC. I have no clue how to tinker with that though.
I was thinking that an HLS endpoint would also be really useful for the web UI. It wouldn't necessarily have to replace the `AutoUpdatingImage` (that's used as a replacement for the MJPEG feed), but it could instead just pipe the live view through HLS so you can display one or more camera feeds at a time, sort of a "security overview".