xbmc / inputstream.adaptive

kodi inputstream addon for several manifest types

Halfway decent buffering concept for livestreams (e.g. HLS/Dash) #726

Closed aedicted closed 3 years ago

aedicted commented 3 years ago

I might be missing something here, and maybe everything has been thought through and things are the way they are supposed to be, but somehow the silly buffering implementation of virtually any player baffles me.

For instance, out of concrete practical need these days during the soccer European Championship (EURO) 2020/2021: Kodi in conjunction with the ORF TVthek addon, which seems to use this very inputstream.adaptive plugin, is just awful.

Given the premise of suboptimal, non-constant data throughput (otherwise even terrible implementations won't look too bad, which is part of my criticism), be it due to the user's own internet access, interconnection between ISPs, congestion on a VPN/proxy/other tunnel, or the streaming provider itself, I would expect the following:

The player won't play anything at first but will keep filling its buffer, whose size should ideally be user-configurable without any XML/file editing acrobatics. This maximum amount is naturally limited by the streaming provider, as at some point certain segments will simply expire and no longer be available, so some data loss is unavoidable in principle. That aside, I would further expect the player to start playing the content while continuing to request new parts as fast as possible to refill the buffer. Extending that to HLS with separate segments, requesting several of them in parallel would be a clever thing as well, given the observation that downloading only one segment at a time often turns out to be rather slow thanks to TCP window behaviour.
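The policy described above (fill the buffer completely before playback starts, and fetch HLS segments in parallel rather than one at a time) can be sketched as follows. This is a minimal illustration, not code from inputstream.adaptive; `fetch_segment()` is a hypothetical stand-in for a real HTTP download, and the segment count and worker count are assumptions.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_segment(url: str) -> bytes:
    # Placeholder for an HTTP GET of one media segment.
    time.sleep(0.01)
    return b"\x47" * 188  # one fake MPEG-TS packet

def fill_buffer(segment_urls, target_segments=8, workers=4):
    """Download up to target_segments in parallel before playback begins."""
    urls = segment_urls[:target_segments]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves segment order, which matters for playback.
        return list(pool.map(fetch_segment, urls))

buffer = fill_buffer([f"seg{i}.ts" for i in range(20)])
print(f"buffered {len(buffer)} segments before starting playback")
```

The parallel fetch mitigates exactly the per-connection TCP slowness mentioned above, since each segment download ramps up its own congestion window.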

Instead I encounter the extremely frustrating and nutty behaviour as follows:

  1. Any fiddling in advancedsettings.xml for Kodi doesn't seem to have any effect on HLS/DASH streaming whatsoever. So much for option 1 (buffer ANY source).

  2. While Kodi seems to buffer the input stream halfway decently at the start, the total amount looks nothing like the 8-second window that has been mentioned here and there ... Once that already far too stingy buffer runs empty (which, especially on mobile connections like LTE, occurs with high probability), the player doesn't wait until at least that buffer is full again; instead it attempts to resume playback as soon as only a part of that small buffer is regained, unsurprisingly almost asking for the next buffer underrun and stutter.

The oddity becomes apparent when one manually pauses playback in such a buffer-underrun event and resumes only when at least that "stingy buffer" is entirely full again. Et voilà, playback does not pause right away again but keeps running for quite a while, even if the total input rate is slightly below the normally required one. So while flawless playback in such cases would only be possible with a really huge buffer and delay, and even assuming repeated buffer underruns are unavoidable due to insufficient throughput, it is beyond my comprehension why almost every player implements its buffering "strategy" in such a lousy and annoying way.
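The manual pause-until-full workaround described above amounts to a low/high watermark (hysteresis) policy: pause when the buffer drains below a low threshold, and resume only once it has refilled to a high threshold. A minimal sketch, with illustrative thresholds that are not taken from inputstream.adaptive:

```python
class HysteresisBuffer:
    def __init__(self, low_secs=2.0, high_secs=8.0):
        self.low = low_secs    # underrun threshold: pause here...
        self.high = high_secs  # ...and only resume once refilled to here
        self.playing = False

    def on_buffer_level(self, seconds: float) -> bool:
        """Update the buffer level; return True if playback should run."""
        if self.playing and seconds <= self.low:
            self.playing = False   # underrun: stop draining the buffer
        elif not self.playing and seconds >= self.high:
            self.playing = True    # refilled: safe to resume playback
        return self.playing

buf = HysteresisBuffer()
states = [buf.on_buffer_level(s) for s in (8.0, 5.0, 1.0, 4.0, 8.0)]
print(states)  # playback stays paused at 4.0 s and resumes only at 8.0 s
```

The gap between the two thresholds is what prevents the immediate re-stutter: resuming at a partially refilled buffer (the behaviour complained about) is equivalent to setting `high_secs` barely above `low_secs`.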

I think you as a developer replied at some point that a really decent buffering strategy would be very complex to implement (dealing with different stream qualities, which segments to request, how to switch, etc.), but how difficult can it be to at least improve the buffering behaviour for a given stream, comparable to what anyone can apparently achieve manually by pausing?

I would already be satisfied if Kodi or your plugin (whichever effectively does the buffering) allowed maxing out the buffer up to the streaming operator's server-side limit and waited until that buffer is completely full again before resuming playback for a given stream.

peak3d commented 3 years ago

There is a branch "rework" which addresses the buffer issues. Some minor things still have to be fixed, and it must be tested heavily, but the implementation is nearly ready.

aedicted commented 3 years ago

Thanks a lot for your quick reply and promising information, peak3d!

Could you maybe elaborate on the changes and improvements?

I would also be interested in knowing what the practical limits of buffer sizes with HLS/DASH streams are, from your experience, as the expiration of segments on the server side is, to my understanding, ultimately the limiting factor.

Also, I would be eager to know why most player implementations really are so bad in general when it comes to buffering. It is hard to believe that it is just ignorance or lack of knowledge; maybe there is a philosophy behind it, such as reducing latency as much as possible or saving space on the server side, and hence the buffers being designed so small. What is your opinion on that?

If possible, I'd be willing to test any beta of your plugin of course and provide feedback.

glennguy commented 3 years ago

@aedicted The GSOC 2020 student worked closely with peak3d to get us nearly all of the way there last year, unfortunately time constraints got in the way but sometimes things are just this way.

As peak3d said, a lot of work is in the rework branch as well as in #506. There are still issues such as multi-period files, and lots of testing will need to happen. Also, it will start life in Kodi 20 and possibly be backported to Matrix once stable.

I plan to take this up soon: first getting the branch rebased onto our current work, then refactoring so some unit tests can be run.

Yes, the buffering situation is currently poor: we have an 8-second internal buffer to work with inside Kodi, and that is all. If you have a fast connection and the server behaves well, there are no problems, but there are many cases where servers show large amounts of lag even for low-quality/low-bandwidth streams; please see existing issues #8, #14, #392.

aedicted commented 3 years ago

Thanks for the elaboration. My thoughts exactly: the current implementation certainly works nicely given a steady stream and sufficiently performant (content delivery) networks, but in my view good software is defined by its ability to deal with non-ideal situations, just as a good program is supposed not to crash on invalid input, rather than merely working great under perfect conditions.

So with work already happening here and there, is there any concrete build or version of Kodi I could give a try? Alternatively, is there any way to set up a proxy or a PC environment acting as a VPN gateway which pre-requests those MPEG-TS streams in advance? I rather doubt it, as the TCP connections and thus the requests are terminated at the player's side, and with nowadays' Widevine/PlayReady and other DRM shit, things are a nightmare for any decent playback handling anyway (Streamlink won't be of much help anymore). Uargh, this is all so frustrating when things suck that way for no good technical reason while they could be so much better.

This has already been mentioned in other threads here and I think the same: while the attempt by you and peak3d to sort out many details such as multiple streams, different bitrates, smart selection algorithms, etc. is very noble and promising, considering that these thoughts were already had years ago with still no final usable result (no criticism at all, just stating a fact), maybe in the meantime it would be far more valuable to at least increase that tiny 8-second buffer for a single stream right now, instead of designing the ultimate solution with years more of no outcome.

matthuisman commented 3 years ago

You're free to write your own buffer-ahead proxy, help with the code in inputstream.adaptive, or get faster / more stable internet.

That will help get yourself a solution faster :) Walls of text won't help unfortunately :(

aedicted commented 3 years ago

When some streaming providers such as the Austrian ORF or the Swiss one, "Teleboy", still used the unencrypted HLS variant, I knew how to help myself by using Streamlink. With others, let's euphemistically call them "rather unofficial", which offer MPEG-TS streams, one can simply save those and recreate a local HLS stream with e.g. ffmpeg in conjunction with an HTTP server. Not exactly comfortable, but still better than constant buffer underruns. To make things even worse, virtually all now seem to have that DRM crap in place.
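The ffmpeg-plus-HTTP-server workaround mentioned above can be sketched like this. This only builds the ffmpeg command line (stream copy, no re-encoding, repackaged into HLS segments); the file name and segment duration are illustrative assumptions. Pair it with any static file server, e.g. `python3 -m http.server`, to serve the resulting playlist locally.

```python
def local_hls_command(ts_input: str, playlist: str = "local.m3u8"):
    """Build an ffmpeg command to repackage a saved MPEG-TS capture as HLS."""
    return [
        "ffmpeg", "-i", ts_input,
        "-c", "copy",            # copy streams as-is, no re-encoding
        "-f", "hls",             # HLS muxer
        "-hls_time", "6",        # target segment duration in seconds
        "-hls_list_size", "0",   # keep all segments in the playlist
        playlist,
    ]

cmd = local_hls_command("capture.ts")
print(" ".join(cmd))
```

Because the whole capture is on local disk, the player's tiny buffer never underruns; the local HTTP server can deliver segments as fast as it asks for them.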

Since I am not a real programmer beyond some basic Python stuff, this is where it ends for me. However, I would be willing to team up, and if I am good at anything, it is certainly finding bugs and malfunctions. :p

Unfortunately, getting a "better" internet access with less variance, although surely a factor, is only one part of an equation with many variables, as one still has the streaming operator itself, in most cases (due to lots of licensing bullshit) a proxy or VPN provider, and several interconnections between the providers in the chain.

Hence, one always wants decent software, error tolerance and optionally big buffers whenever possible, as apart from the additional delay and memory usage, they act as a safeguard against data-rate shortages regardless of their cause.

If "walls" of text don't help, ignorance obviously doesn't either, as it is rather the very cause of issues in many implementations. As someone famous once said: "You're holding it wrong!"

matthuisman commented 3 years ago

Then I'm afraid it's the waiting game for you :)

Less time spent responding to already-known GitHub issues gives the devs more time to code. :)

They all do it for free in their own spare time. Adding pressure does nothing but burn them out, and usually leads to slower progress.