AudreyBeard opened this issue 1 year ago
The wiki's entry on concatenating files with different codecs may be helpful; it points to the concat filter documentation.
It actually looks like concatenation doesn't solve the problem at all, since it still requires that you know the number of files to concatenate beforehand
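For context on why: the concat demuxer consumes a complete playlist up front, so every file has to be known before playback starts. A minimal sketch of generating such a playlist (filenames here are hypothetical, not from the repo):

```python
from pathlib import Path

def write_concat_list(slice_paths, list_path="slices.txt"):
    """Write an ffmpeg concat-demuxer playlist.

    The demuxer reads this file once at startup, so every slice
    must already be known -- there's no way to append more later.
    """
    lines = [f"file '{p}'" for p in slice_paths]
    Path(list_path).write_text("\n".join(lines) + "\n")
    return list_path

# Hypothetical usage; the actual concatenation would then be:
#   ffmpeg -f concat -safe 0 -i slices.txt -c copy out.mkv
playlist = write_concat_list(["slice_000.mkv", "slice_001.mkv"])
```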
I'm currently reading the wiki entry for streaming
This is honestly too much for me right now and it's not bringing me joy to work on it. I'm gonna table it for now.
Overview
Currently, Compressure functions entirely offline. In an effort to work towards the Live Composition milestone, we want to implement a streaming server/client system such that the server-side can create a timeline dynamically, and the client receives and displays and/or saves the result until the server ends the video.
Additional Info
What are we doing now, and how is this proposal different?
Currently, the system does the following, in order: [0, 1]

(list of steps not recoverable here; the final step concatenates the slices with ffmpeg's concat demuxer)

We want the system to look more like this:

(target-architecture diagram not recoverable here)
If this is in service of Live Composition, why do we keep the timeline function?
This ticket represents a step towards the final goal of live composition, but there are other tasks that need to be accomplished before we can do that. Live composition is not in scope for this ticket.
What have we tried already, and how did it go?
We've tried a few things, none of which have been successful yet. I chalk this up to incompetence more than anything else.
Hacking the concat demuxer to make an infinite switchable live source, streaming via ffmpeg
This was really meant to be the lowest-hanging fruit that gets deprecated quickly, but Audrey got distracted and stopped working on it. This might be a good proof-of-concept, but it's probably not where we wanna end up. Check out the wiki for how this works (scroll down to "Changing playlist files on the fly"). The work can be found on branch stream-concat-hack, in stream.py#L15, in stream.InfinitePlaylist, with usage starting in main.py#L223. I've also heard from someone in the ffmpeg-devel IRC channel that the ffmpeg streaming functionality is pretty brittle and shouldn't be used for anything besides low-overhead development (they recommended building dedicated video server software instead).
Using pipes to pass slices from the server to the client
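For whoever picks the concat hack back up, the "changing playlist files on the fly" trick is roughly the following: each ffconcat playlist ends with a reference to the next playlist file, and you rewrite that next file before the demuxer reaches it. This is a sketch under my assumptions, not what stream.InfinitePlaylist actually does; all names here are hypothetical.

```python
from pathlib import Path

FFCONCAT_HEADER = "ffconcat version 1.0\n"

def write_playlist(index: int, segment: str) -> str:
    """Write playlist_<index>.ffconcat containing one segment plus a
    reference to the *next* playlist file. ffmpeg's concat demuxer
    follows that reference when the segment ends, so rewriting the
    next file before it's reached yields a switchable live source."""
    path = f"playlist_{index}.ffconcat"
    next_path = f"playlist_{index + 1}.ffconcat"
    Path(path).write_text(
        FFCONCAT_HEADER
        + f"file '{segment}'\n"
        + f"file '{next_path}'\n"
    )
    return path

# Hypothetical usage: keep writing playlist N+1 while ffmpeg plays N.
#   ffmpeg -safe 0 -i playlist_0.ffconcat -c copy -f matroska - | ...
first = write_playlist(0, "slice_000.mkv")
```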
This was also meant to be easier than live-streaming over a socket, but I could never get it to work. I think I deleted it, actually... In any case, it's sufficient for single-machine execution and may be simpler to implement, but it doesn't enable live streaming over the internet or to other machines, which may be preferable, especially if we're trying to open this up to live composition on lower-power machines.
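If someone re-attempts the pipe route, the single-machine shape would be roughly this (chunking and names are my own; in the real system the producer side would be ffmpeg writing to its stdout):

```python
import os
import threading

def serve_slices(write_fd, slices):
    """Producer: write each encoded slice into the pipe, then close it.
    Closing the write end is what gives the reader its EOF."""
    with os.fdopen(write_fd, "wb") as pipe:
        for chunk in slices:
            pipe.write(chunk)

def consume(read_fd):
    """Consumer: read until EOF, as a player/saver process would."""
    received = bytearray()
    with os.fdopen(read_fd, "rb") as pipe:
        while chunk := pipe.read(4096):
            received.extend(chunk)
    return bytes(received)

read_fd, write_fd = os.pipe()
server = threading.Thread(
    target=serve_slices, args=(write_fd, [b"slice0", b"slice1"])
)
server.start()
data = consume(read_fd)
server.join()
```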
Proper streaming over sockets
This is probably the heaviest lift, but also probably the "most correct" solution. It would involve instantiating a socket pair (Unix sockets for single-machine, and TCP or UDP for multi-machine), then streaming the data over the socket like a proper streaming service. This is the ideal solution because it opens up many doors w.r.t. maintainability, scalability, portability, and live streaming over the internet. Without a doubt, this is the preferred implementation. Audrey couldn't get it past a prototype because she was kinda burnt out when trying it, but she's down to give it another shot, especially if someone else has eyes on it (and is writing code for it). It's also on stream-concat-hack, in stream.py#L130 and stream.py#L142.
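A bare-bones shape for the socket approach (TCP here; names and framing are my own sketch, not what's on stream-concat-hack). The server pushes encoded slices and then closes the connection, which doubles as the "server ends the video" signal:

```python
import socket
import threading

def stream_slices(conn, slices):
    """Server side: push encoded slices down the socket, then close.
    In the real system these bytes would come from the encoder."""
    with conn:
        for chunk in slices:
            conn.sendall(chunk)

def receive_all(sock):
    """Client side: consume until the server closes the connection."""
    received = bytearray()
    with sock:
        while chunk := sock.recv(4096):
            received.extend(chunk)
    return bytes(received)

listener = socket.create_server(("127.0.0.1", 0))  # OS picks a free port
host, port = listener.getsockname()

client = socket.create_connection((host, port))
conn, _addr = listener.accept()
listener.close()

server = threading.Thread(
    target=stream_slices, args=(conn, [b"slice0", b"slice1"])
)
server.start()
data = receive_all(client)
server.join()
```

Swapping TCP for a Unix socket on a single machine is mostly a matter of changing the address family; the read/write loops stay the same.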