vbence / stream-m

An HTML5-compatible live streaming server supporting the WebM and H.264 formats.
MIT License

Support for HLS? #37

Open allegfede opened 6 years ago

allegfede commented 6 years ago

As you may have read in another issue thread, I had success streaming pre-recorded video clips through stream-m (in WebM and H.264/FLV), but I am still working on feeding the server with hardware-encoded H.264/AAC live video.

Today I tested the idea of using HLS instead of RTMP, and it seems to work.

This ffmpeg command: ffmpeg -y -re -f mpegts -i /dev/video0 -c:v copy -c:a copy -f hls test.m3u8

generates a bunch of test#.ts files and a plain-text file (named test.m3u8) that lists the TS fragments and the order in which to play them back.
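
For reference, the generated test.m3u8 is a plain-text playlist along these lines (segment names, durations and count depend on the input's keyframe interval and the hls_time option, so take this only as an illustration of the format):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:9
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:8.333333,
    test0.ts
    #EXTINF:8.333333,
    test1.ts
    #EXTINF:5.000000,
    test2.ts
    #EXT-X-ENDLIST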

VLC plays all the clips fine, and also the m3u8 ... so the question is: could adding HLS to stream-m be an option?

cheers.

Federico

vbence commented 6 years ago

Hi, live streaming over HLS would involve muxing into MPEG-TS (a container format that is completely missing from the codebase), and the server would also need to dynamically generate and refresh the .m3u8 file (since the fragment files are created on the fly).

So this would be a pretty large effort. On the bright side, really low latencies are achievable with HLS.
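
To illustrate what that refresh involves (this is just the standard HLS playlist format, not anything stream-m produces today), a live playlist is a sliding window that the server rewrites each time a new fragment is finished, for example:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:4
    #EXT-X-MEDIA-SEQUENCE:42
    #EXTINF:4.0,
    live42.ts
    #EXTINF:4.0,
    live43.ts
    #EXTINF:4.0,
    live44.ts

On each refresh the oldest entry drops off, a new one is appended, and EXT-X-MEDIA-SEQUENCE is incremented; the EXT-X-ENDLIST tag is omitted while the stream is live.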

NewUserHa commented 3 years ago

Isn't HLS high-latency only (more than a few seconds), while RTMP is low-latency?

vbence commented 3 years ago

Hi, my requirement for low latency is that a frame should be playable as soon as it is received (in a good case, with slicing, decoding can even start before the full frame has been downloaded). The biggest obstacle is the container format. RTMP is one of the good ones in this respect: a single frame carries its own timing (and framing) right next to it, and so does MPEG-TS (used in HLS). Other containers, like MKV (in some respects) and MP4 (DASH), need to know the whole movie fragment (GOP) before muxing, because the framing and timing info is not stored with the frame data but in its own chunk; basically you receive all the metadata first, then all the frame data.

We usually consume HLS as a series of small-ish files. This is cheap because you can put a classic CDN in front of it, but I don't think you absolutely have to use it like that. You can create a TS stream on the fly (and emulate one large .ts file); that way, while the broadcaster is transmitting a frame, you can already start sending it to the viewer.
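
For illustration only, here is a minimal sketch of that idea: serving an "endless" .ts over HTTP with chunked transfer, pushing TS packets out as they arrive. The class name, the /live/stream.ts path and the openLiveSource() helper are hypothetical placeholders and not part of stream-m; it assumes the live TS packets come from some external source.

    // Minimal sketch (not stream-m code): serve an "endless" MPEG-TS stream over HTTP.
    import com.sun.net.httpserver.HttpServer;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;

    public class EndlessTsServer {

        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

            server.createContext("/live/stream.ts", exchange -> {
                exchange.getResponseHeaders().set("Content-Type", "video/mp2t");
                // A response length of 0 means "unknown length": chunked transfer
                // is used, so the body can keep growing as long as the broadcast runs.
                exchange.sendResponseHeaders(200, 0);

                try (InputStream source = openLiveSource();   // hypothetical live TS source
                     OutputStream viewer = exchange.getResponseBody()) {
                    byte[] buffer = new byte[188 * 7];        // a handful of 188-byte TS packets per write
                    int n;
                    while ((n = source.read(buffer)) != -1) {
                        viewer.write(buffer, 0, n);
                        viewer.flush();                       // push frames to the viewer as soon as they arrive
                    }
                }
            });

            server.start();
        }

        // Placeholder for wherever the live TS packets come from (e.g. the encoder's output).
        private static InputStream openLiveSource() {
            throw new UnsupportedOperationException("connect this to the live TS source");
        }
    }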

I have not tested how well today's browsers handle it, but I can't think of a good reason why they wouldn't.

NewUserHa commented 3 years ago

I've heard HLS has at least a few seconds of latency, because of generating the fragments and the m3u8, while RTMP can achieve <1 s latency.

If it streams as one large MPEG-TS container, it can be seen as just an HTTP stream and shouldn't really be called HLS. Also, sources mostly use RTMP to stream, so the server would need to convert the protocol from RTMP to HTTP; the latency might then be around 1 s, but I haven't tested it.

vbence commented 3 years ago

The m3u8 can be generated in advance, and the files don't have to be there immediately; I don't know of any limit on how large a .ts file can be. You can call it HLS in the sense that any HLS-capable player should be able to play it.

NewUserHa commented 3 years ago

But if you use an m3u8 and fragments, then there would be a lot of latency.

I guess .ts has no size limit, since TV broadcasting uses MPEG-TS.

vbence commented 3 years ago

You can generate the m3u8 without any knowledge of the stream; it can be static and just point to the stream's name. At the URL the m3u8 points to, the server can serve the .ts as an endless file.

This way no additional latency is introduced, except for a relatively small buffer on the server side (let's say 24 kB).
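
As an illustration of that idea (the /live/mystream.ts URL is just a placeholder, not anything stream-m serves today), such a static playlist could be as simple as:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXTINF:10.0,
    /live/mystream.ts

The server would keep writing TS packets to that URL for as long as the broadcast runs, so the duration values are only nominal; as noted above, how well players cope with a "segment" that never ends is untested.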