The problem appears to be that the duration is 75 seconds and you have only provided 74.99984 seconds of media (19 * 98684/25000).
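To illustrate the arithmetic (a minimal sketch using the numbers quoted above; the variable names are made up, this is not any player's actual code), a SegmentTemplate-based player that derives the segment count from mediaPresentationDuration ends up requesting one more segment than was ever written:

```ts
// Sketch: how a SegmentTemplate-based player typically derives the segment
// count from mediaPresentationDuration, using the figures quoted above.
const timescale = 25000;              // ticks per second (SegmentTemplate@timescale)
const segmentDuration = 98684;        // ticks per segment (SegmentTemplate@duration)
const mediaPresentationDuration = 75; // seconds, from the MPD
const availableSegments = 19;         // segments actually produced by the packager

const secondsPerSegment = segmentDuration / timescale;                    // 3.94736 s
const providedMedia = (availableSegments * segmentDuration) / timescale;  // 74.99984 s

// The player needs enough segments to cover the advertised duration,
// so it rounds up - and asks for a segment that does not exist.
const neededSegments = Math.ceil(mediaPresentationDuration / secondsPerSegment); // 20

console.log(providedMedia);  // 74.99984
console.log(neededSegments); // 20 -> the request for segment 20 returns a 404
```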
Does changing the duration fix the problem?
Yes, changing the duration of the original video fixes the problem. Do you confirm that it is a math issue? Could some kind of round() fix the problem?
Thanks
I don't think this is a player issue - there's no floating point issue or similar here; the numbers are correct. You just need to provide media for the entire duration of the presentation - either add an extra segment containing data for the remaining time or limit the duration to the media that is provided.
Furthermore, even the manifest does not pass validation. The content production workflow does not appear to be producing valid DASH, in multiple respects.
We are using the bitcodin service to encode the video, so we cannot make any adjustments to the encoder.
Thank you
I recommend you contact Bitcodin support then, to get their encoding service fixed. I am sure they would not want to force you to turn toward another provider who can produce valid DASH...
Standards compliance aside, it could be something similar to this gpac issue: https://github.com/gpac/gpac/issues/360 - VOD ending on a video segment border in conjunction with AAC audio is tricky because of the AAC padding (HLS segmenters can also have trouble with it). The gpac guys managed to solve it ;-)
I had a look at the manifest as well. There are two validation errors in the manifest: 1) spaces in the id of a Representation, which should not cause any issue with playback, and 2) the presence of an additional attribute "streamName" in SegmentTemplate, which should be ignored by any player.
Of course we will fix those issues to be standards compliant, but they are surely not responsible for the playback issue.
When you try to play back the stream with our player at [1] it does not produce any errors. The reason for the error in Flowplayer and the DASH-IF player is that the mediaPresentationDuration is a little too long, as already mentioned here. To be exact, it is 0.000040533 seconds too long. In such edge cases it can be hard to calculate an exact mediaPresentationDuration that does not indicate an additional segment which obviously does not exist. That is why we also flag the last segments with the 'lmsg' brand. It is certainly not an error if players are tolerant in cases where they calculate a last segment duration of almost zero length. Additionally, it would be great if the 'lmsg' brand were handled correctly.
Nevertheless, we will also try to fix this in our encoding system by using fractions of seconds in the mediaPresentationDuration, so that it is correct for those edge cases as well.
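One way to do that (a minimal sketch, not the actual Bitmovin implementation; the helper name is made up) is to derive the MPD duration directly from the shortest track's tick count and truncate rather than round, so the advertised duration can never exceed the media that was actually written:

```ts
// Sketch: derive mediaPresentationDuration from the exact tick count of the
// shortest track, truncating so the MPD never advertises more media than exists.
function mpdDuration(totalTicks: number, timescale: number): string {
  // Truncate to microseconds instead of rounding, so the value never rounds up
  // past the end of the media.
  const micros = Math.floor((totalTicks * 1_000_000) / timescale);
  return `PT${micros / 1_000_000}S`;
}

// 19 segments of 98684 ticks at a 25000 Hz timescale:
console.log(mpdDuration(19 * 98684, 25000)); // "PT74.99984S" instead of "PT75S"
```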
The problem is that use of the lmsg brand is under-defined - the MPEG spec says nothing about player behaviour in this scenario, and the DASH-IF IOP only mentions the live case, where its use makes sense when the duration is unknown and seems like a reasonable solution (though I imagine lmsg has now been superseded by the use of DASH events). In the on-demand case the duration is known at publish time and should be correct.
The problem with ignoring the last segment if its duration is virtually zero is that, if its length is non-zero, it must, by definition, contain media data to cover the remaining duration. Ignoring this for compliant streams simply to cater for non-compliant implementations seems like a bad idea.
For situations where the audio and video streams are not exactly the same length, it can be tricky or even impossible to define a mediaPresentationDuration that fits both streams. What would the dash.js player's proposed solution be for such cases?
We chose to support 'lmsg' for such cases. Other vendors like Unified Streaming have the same approach [1].
[1] http://docs.unified-streaming.com/documentation/vod/troubleshooting.html#audio-is-longer-than-video
I would expect the minimum value (duration of the shortest component) to be used if the durations of the different components are out of sync.
The durations of the streams are usually not out of sync but can be different because of AAC padding (1024 samples). However, there are cases where the stream durations of the source file are already different. When we always used the shortest stream duration we had complaints, especially with very short clips (< 30s), that some video frames were missing at the end of the video.
We will probably generate an additional segment list manifest specifically for the dash.js player to avoid the segment-not-found error for now.
We need to decide what we expect dash.js to do when presented with media of this nature so we can close this issue.
To summarise, there appear to be two options: 1) have dash.js honour the 'lmsg' brand on the final segment, or 2) tolerate a missing segment when the remaining duration is close to zero.
For the reasons above (https://github.com/Dash-Industry-Forum/dash.js/issues/1455#issuecomment-227978296), I'm not keen on the first. I definitely don't think we should do the second.
I believe this is a content provider issue.
@wilaw, given the arguments above, can DASH-IF provide guidance on how to resolve this issue? Should players be expected to handle the lmsg brand in the on-demand case and, if so, what is the expected behaviour?
@bbcrddave - I will ask the DASH-IF IOP group for rulings on these two questions.
Stand by.
@wilaw I think this is a common problem that relates not only to us but also to, e.g., Unified Streaming, according to their documentation (http://docs.unified-streaming.com/documentation/vod/troubleshooting.html#audio-is-longer-than-video), whenever you deal with audio and video of different lengths. As mentioned, it works with our own player, but we also want to provide SegmentTemplate content that works with the dash.js player.
The current workaround is that we provide a special segment list MPD for the dash.js player, which is obviously much larger in size and negatively impacts startup behavior for dash.js users.
I spoke with Thomas Stockhammer today. He said that using lmsg for VOD content is fine, even though it is not explicitly addressed in the current IOP docs. I have filed a github issue with the DASH IF IOP group to add VOD-lmsg usage to the next rev of the docs.
So dash.js should use lmsg if it exists. The trouble is that the only way to know whether it exists is for the player to parse every segment it receives. This results in a lot of wasted CPU effort since most content does not embed lmsg today. Question for @schellkenig and @msmole - does Bitmovin signal in the manifest that it is embedding lmsg in the segments? Or does your player parse every segment it receives when playing VOD content?
@bbcrddave - I also think that we could make some practical decisions here to make dash.js more robust in the real world. If I get a 404 on a segment and the current duration is within one frame of the signaled duration, would it not make sense to declare EOS? For the purists, we could make an API to turn this off, or to tighten this threshold value.
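To make the suggestion concrete, here is a rough sketch of that heuristic (the function and parameter names are illustrative, not dash.js API; the one-frame threshold is the value suggested above):

```ts
// Sketch of the proposed tolerance rule: if a segment request 404s and the
// buffered media already reaches within ~one frame of the signalled duration,
// treat it as end-of-stream instead of surfacing a download error.
function onSegmentNotFound(
  bufferedEnd: number,               // seconds of media already buffered
  mediaPresentationDuration: number, // seconds, from the MPD
  frameDuration: number,             // e.g. 1/25 for 25 fps content
  signalEndOfStream: () => void,
  raiseDownloadError: () => void
): void {
  const remaining = mediaPresentationDuration - bufferedEnd;
  if (remaining <= frameDuration) {
    signalEndOfStream();   // close enough to the advertised end: declare EOS
  } else {
    raiseDownloadError();  // a real gap: surface the error
  }
}
```

An API setting could then expose the threshold, or disable the behaviour entirely, along the lines suggested above.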
I still believe that lmsg needs significant definition: what should dash.js actually be doing with it?
In practical terms - at least in an MSE implementation, but I suspect also in most DASH implementations - the playback timeline cannot advance if there is no data for all media streams, so it is unclear to me how this is supposed to work in practice.
The biggest problem for me, though, is the issue of how to handle the player seeking past the segment with the lmsg brand. In this case you are still going to get a 404, and the likelihood increases as the gap between the end of media data and the signalled duration increases - there's nothing to stop lmsg being used at any point to signal that a Representation is complete.
I'm all for making dash.js more robust in the real world, but what you're suggesting appears to normalise an error condition which I don't think is a sensible thing to do. 404s always indicate some error in the stream, whether with the serving or the content preparation, and I don't feel it's acceptable to say that they're ok if you're near the end of the stream or similar when, ultimately, you should never have been told that segment was available.
Note that I am not a purist - this is an operational issue affecting our users and we wish to see a resolution. Clearly we'd rather that the onus was on content preparers to encode, package and signal valid streams but if they are not able to do this for whatever reason then it'd be preferable to have a standardised mechanism for the player to cope.
@wilaw To come back to the question of how our player handles lmsg: as we parse every segment anyway, checking for lmsg does not introduce additional overhead. Currently, we do not signal the presence of lmsg in the manifest. However, I really like the idea of making that explicit.
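For context, the check itself is cheap on top of a parser that is already walking the segment's boxes. A minimal sketch of scanning an ISO BMFF media segment's 'styp' box for the 'lmsg' brand (illustrative only, not Bitmovin's or dash.js's actual code):

```ts
// Sketch: scan the top-level boxes of an ISO BMFF media segment and report
// whether its 'styp' box carries the 'lmsg' brand.
function hasLmsgBrand(segment: ArrayBuffer): boolean {
  const view = new DataView(segment);
  let offset = 0;
  while (offset + 8 <= view.byteLength) {
    const size = view.getUint32(offset);        // box size
    const type = readFourCC(view, offset + 4);  // box type
    if (type === 'styp') {
      // styp layout: major_brand (4) + minor_version (4) + compatible_brands (4 each)
      if (readFourCC(view, offset + 8) === 'lmsg') return true;  // major brand
      for (let p = offset + 16; p + 4 <= offset + size; p += 4) {
        if (readFourCC(view, p) === 'lmsg') return true;         // compatible brands
      }
      return false;
    }
    if (size < 8) break; // size 0/1 (extends-to-end / 64-bit) not handled in this sketch
    offset += size;
  }
  return false;
}

function readFourCC(view: DataView, offset: number): string {
  return String.fromCharCode(
    view.getUint8(offset), view.getUint8(offset + 1),
    view.getUint8(offset + 2), view.getUint8(offset + 3)
  );
}
```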
The playback issue related to this report has been "fixed" by the implementation of the "jump gaps" feature. Among other things, it manages playback correctly when there are small length differences between the video and audio tracks.
Note: "jump gaps" is a feature that makes this content play, but it doesn't fix the root of this issue (the lack of an implementation of lmsg styp box parsing). For example, when trying to play the sample stream provided in this issue description, playback works but dash.js raises a segment download error because it tries to download a segment that doesn't exist (one of the tracks is shorter than expected).
That said, I am going to close this issue; all progress on the lmsg box parser implementation will be documented in its own issue (#1715), to avoid duplicating comments.
Sometimes the player searches for a segment that does not exist. The affected players are Flowplayer and the DASH-IF player; other players (like Bitmovin's) work fine.
This is mpd url: http://video.internazionale.it/2016/05/20/250432_3aa47248f7d92a2b8d82de8ae50fbac8/250432.mpd
Test page with flowplayer: http://www.internazionale.it/flowplayer-dash-issue.htm
Thank you
Flavio