IIIF / iiif-av

The International Image Interoperability Framework (IIIF) Audio/Visual (A/V) Technical Specification Group aims to extend to A/V the benefits of interoperability and the growing ecosystem of clients and servers that IIIF provides for images. This repository contains user stories and mockups for interoperable A/V content – contributions are welcome.
http://iiif.io/community/groups/av/
Apache License 2.0

How might we handle adaptive streaming in a video info.json? #57

Open jronallo opened 7 years ago

jronallo commented 7 years ago

This question is a specialization of how we are handling progressive download videos by listing them out as individual sources. You can see an example of listing sources in https://github.com/IIIF/iiif-av/issues/50

With adaptive streaming formats like HLS and MPEG-DASH, what information should we include for a source? Each of these formats points to a manifest document (an M3U8 master playlist for HLS, a media presentation description for MPEG-DASH) that lists all of the different adaptations available for a video. The client can then select which adaptation to play based on bandwidth and resolution, and can switch mid-stream as conditions change. In some cases a single file is used for each adaptation (the on-demand profile, which then uses byte-range requests for segments), and in others the source video is pre-segmented at each bitrate and resolution (the live profile).

The assumption is that in some cases multiple sources are still likely to be listed even when an adaptive version is available. This could be because both an HLS and an MPEG-DASH stream are made available to reach more devices. (Currently the two use different underlying media formats and so work on different platforms, though the latest HLS allows fragmented MP4 as MPEG-DASH has been using.) The listed sources could also include a progressive download source for clients that don't support the adaptive formats and can't run a client-side library (e.g. hls.js or dash.js).
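For illustration, a list like that might look something like the sketch below. None of these property names are settled; they just follow the general style of the sources listing discussed in #50, and the URLs are made up.

```json
{
  "id": "https://example.org/iiif/av/lecture",
  "sources": [
    {
      "id": "https://example.org/iiif/av/lecture/master.m3u8",
      "format": "application/x-mpegURL"
    },
    {
      "id": "https://example.org/iiif/av/lecture/manifest.mpd",
      "format": "application/dash+xml"
    },
    {
      "id": "https://example.org/iiif/av/lecture/720p.mp4",
      "format": "video/mp4",
      "width": 1280,
      "height": 720
    }
  ]
}
```

A client would try the adaptive entries if it can play them (natively or via hls.js/dash.js) and fall back to the progressive MP4 otherwise.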

I could see this going two ways. In both cases we would list the URL of the adaptive streaming manifest as one of the sources, along with high-level information about that source such as its format. But then we could either stop there, or we could also list the available adaptations within each adaptive streaming source. Basically, the server creating the info.json for a video could look inside the manifest document and pull out information about each adaptation. This seems duplicative, but maybe it is useful to have all of this information available with the single request for the info.json? Would the set of available adaptations ever change a client's decision about whether to play an adaptive stream?

Though it isn't required, it is common and recommended practice for the audio adaptation(s) to be kept separate from the video adaptation(s). This means that if we were going to describe the adaptations for a source, we would be describing the technical details of both the audio and the video, which could either be separate files or multiplexed together.
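If we went that second route, a source entry might look roughly like this. Again this is just a sketch: the `adaptations` property and its fields are hypothetical, invented here only to show the shape of the information a server could pull out of the manifest.

```json
{
  "id": "https://example.org/iiif/av/lecture/manifest.mpd",
  "format": "application/dash+xml",
  "adaptations": [
    { "type": "video", "format": "video/mp4", "width": 1920, "height": 1080, "bitrate": 5000000 },
    { "type": "video", "format": "video/mp4", "width": 1280, "height": 720,  "bitrate": 2500000 },
    { "type": "video", "format": "video/mp4", "width": 640,  "height": 360,  "bitrate": 800000 },
    { "type": "audio", "format": "audio/mp4", "bitrate": 128000 }
  ]
}
```

Note that the audio adaptation is listed separately from the video adaptations, reflecting the common practice described above.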

Thoughts on this? Other ways we could handle adaptive bitrate streaming in a video info.json?

jwd commented 7 years ago

Thanks for writing this up, Jason. My view would be not to make things too complicated and to follow your suggestion of just having an entry in the info.json pointing to the adaptive streaming manifest (M3U8 or MPD) along with as many properties as are common to and known about the sources referenced by that manifest. I agree it should be possible to list adaptive sources alongside standalone sources.
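In other words (again with a made-up URL and non-normative property names), an adaptive source would reduce to something like:

```json
{
  "id": "https://example.org/iiif/av/lecture/master.m3u8",
  "format": "application/x-mpegURL",
  "duration": 3600.0
}
```

where `duration` stands in for whatever properties happen to be common to, and known about, all of the adaptations the manifest references.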

azaroth42 commented 7 years ago

Completely agree with Jon -- let's start simple and get traction first. That would mean simply pointing to the streaming manifest, with the same basic properties as for the other formats.

jronallo commented 7 years ago

Sounds good to me. I'll work on adding an example to my prototype.

azaroth42 commented 7 years ago

(outsourced, as Prezi can just refer to HLS, and clients use the format-specific info)

azaroth42 commented 7 years ago

Propose close, out of scope / solved by adaptive formats already

bvibber commented 7 years ago

Agree with close / out of scope.

zimeon commented 7 years ago

This issue came up on the 2017-03-07 AV call in the context of a set of streams at different bitrates from Avalon. The agreement on the call was that building adaptive streaming on top of these is out of scope (use a format that supports it instead) and that the use case is really about user selection (where a label for each stream indicating the difference is adequate).