MPEGGroup / CMAF

Official MPEG repository to discuss public issues on CMAF (ISO/IEC 23000-19)

frame rate signaling #11

Closed haudiobe closed 7 months ago

haudiobe commented 4 years ago

The frame rate of the content is an important parameter for many systems standards. However, the CMAF Header has no means to signal the frame rate. One may determine the frame rate from the SPS,

but the above only applies to specific media profiles.

Hence the proposal is to add a field to the media header to signal the frame rate of the content, and to indicate whether it is fixed or a maximum value. For each media profile, there should be a description of how to determine the frame rate following the above.

haudiobe commented 4 years ago

Proposal from @cconcolato

- Get moov/mvex/trex/default_sample_duration (e.g. 1001)
- Get moov/trak/mdia/mdhd/timescale (e.g. 24000)
- The frame rate is timescale/default_sample_duration (e.g. 24000/1001)
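A minimal sketch of this derivation (illustrative only; a real packager or player would read these values with an MP4 parser, and default_sample_duration is often left at 0, as noted below):

```python
from fractions import Fraction
from typing import Optional

def frame_rate_from_defaults(timescale: int, default_sample_duration: int) -> Optional[Fraction]:
    """Derive the frame rate from mdhd.timescale and trex.default_sample_duration.

    Returns None when default_sample_duration is 0 (i.e. unset), which is common
    because CMAF 7.7.3 requires per-sample durations in the trun/tfhd anyway.
    """
    if default_sample_duration <= 0:
        return None  # guard against the division-by-zero case discussed below
    return Fraction(timescale, default_sample_duration)

# Example values from the proposal above: 24000/1001 ~= 23.976 fps
print(frame_rate_from_defaults(24000, 1001))  # Fraction(24000, 1001)
print(frame_rate_from_defaults(24000, 0))     # None (default not set)
```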

mikedo commented 4 years ago

And, then there is temporal sublayering (signaled in the NAL)...

sdp198 commented 3 years ago

The problem with using the trex default_sample_duration is that it may not be set (there's a high chance of it being zero, giving a division-by-zero error when you try to compute the frame rate :-) ).

In fact I'd say it's unlikely to be set, because 7.7.3 of CMAF says that these values must be stored within the trun or tfhd (i.e. always contained within a CMAF Fragment). So there's been no point for a packager to set them in the trex.

For the SPS method, of course that doesn't work for avc3/hev1 if there are no parameter sets in the CMAF Header, but also I'd say it's 50:50 as to whether timing information is in the VUI. Also I don't think time_scale/num_units_in_tick is correct for AVC, as I seem to remember there's an x2 multiplier for progressive content.

(Sorry these are only pointing out problems, not solutions - within our workflows I've always had to manage frame rate out of band)
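For reference, a hedged sketch of the VUI-based derivation mentioned above, assuming the timing fields are present at all (in H.264 a clock tick nominally corresponds to a field, hence the x2 factor for progressive content; in H.265 it corresponds to a picture):

```python
def vui_frame_rate(time_scale: int, num_units_in_tick: int, codec: str) -> float:
    """Estimate frame rate from VUI timing info, if the encoder wrote it.

    This is only an estimate: the VUI timing fields are optional and are not
    guaranteed to reflect the actual sample cadence in the track.
    """
    if num_units_in_tick == 0:
        raise ValueError("VUI timing info not present or invalid")
    if codec == "avc":
        return time_scale / (2 * num_units_in_tick)  # tick = field for AVC
    return time_scale / num_units_in_tick            # tick = picture for HEVC

print(vui_frame_rate(time_scale=50, num_units_in_tick=1, codec="avc"))   # 25.0
print(vui_frame_rate(time_scale=25, num_units_in_tick=1, codec="hevc"))  # 25.0
```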

cconcolato commented 3 years ago

So there's been no point for a packager to set them in the trex.

I agree that some packagers don't do it but there is a point in doing it, precisely to provide frame rate information. We could force setting it in a new structural brand ...

sdp198 commented 3 years ago

If something new is going to be added I'd rather see a new box to explicitly describe the frame rate, rather than a brand to say that it can be inferred from defaults.

cconcolato commented 2 years ago

@dwsinger I wonder if you have an opinion on introducing a new box (or sample group) to indicate runs of constant frame rate samples.

dwsinger commented 2 years ago

We've historically not talked about "the" frame rate, for at least the following reasons:

If it were an informative piece of information, there's a risk of it being wrong (or missing). If it's normative, it would take a major change to the basic timing structures of the file format.

Kilroy didn't want values in the movie header because he wanted to be able to ignore it – so he could switch tracks without bothering to check whether they used the same init segment. I never understood this; comparing two URLs for equality is not hard.

So overall, I like the idea that if your content has a constant frame rate, signal it in the trex default_sample_duration (and somehow promise that you don't always override it, probably with a brand).

cconcolato commented 2 years ago

The issue is related to how to generate the manifest from the CMAF content and this is not easy. We agree to draft a recommendation to set default_sample_duration and that it should not be overridden. We need to be careful about codecs that already carry frame rate information in the bitstream.

sdp198 commented 2 years ago

When you say it should not be overridden, do you mean "not overridden with something different"? Because the existing spec requires it to be overridden in every CMAF Fragment. Changing that requirement (saying sample durations are either optional or prohibited inside CMAF Fragments) would be a significant incompatibility with the existing spec.

Also beware that at present the trex has to be identical for all tracks in a switching set, which would prevent setting accurate defaults there if you have mixed frame rates (eg 25 and 50 Hz) in a switching set.

haudiobe commented 2 years ago

addressed in https://dms.mpeg.expert/doc_end_user/documents/137_OnLine/wg11/m58936-v2-m58936-Issues-r1.zip.

Proposal: Add to clause 9.2.3 Track Header Box

— The default_sample_duration should be set such that the frame rate of the video content can be deduced from this information. The value of the sample duration should not be overridden by the Track Run Box ('trun').

NOTE: Manifests may use the default_sample_duration, if present, to extract the frame rate of the video content. If the sample duration is changed in a CMAF track, display parameters may need to be updated, possibly resulting in a non-seamless experience.

dwsinger commented 2 years ago

I don't think this quite does it. There are occasions when the decode duration does not follow a constant cadence, even when the frame rate is constant (at least I recall being shown such examples in the past).

How about something like:

When the xxxx brand is present, default_sample_duration shall be either 0 (indicating unknown or non-constant frame rate), or be equal to the timescale divided by the composition frame rate.

It might be overridden by some samples because of decode duration variations, but that doesn't matter: you wanted to know the composition frame rate, and this supplies it.
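To illustrate the proposed rule (the 'xxxx' brand is still a placeholder), a sketch of how a packager might set the field under that constraint, assuming the timescale can represent the frame duration exactly:

```python
from fractions import Fraction

def default_sample_duration_for(timescale: int, frame_rate: Fraction) -> int:
    """Return timescale divided by the composition frame rate, or 0 when the
    result is not an integer (unknown/non-constant rate, or a timescale that
    cannot represent the frame duration exactly)."""
    duration = Fraction(timescale) / frame_rate
    return int(duration) if duration.denominator == 1 else 0

print(default_sample_duration_for(24000, Fraction(24000, 1001)))       # 1001
print(default_sample_duration_for(90000, Fraction(30000, 1001)))       # 3003
print(default_sample_duration_for(10_000_000, Fraction(30000, 1001)))  # 0 (not representable)
```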

sdp198 commented 2 years ago

I like @dwsinger's proposed text. But I think we should add a note referring to section 7.7.3:

Note: Section 7.7.3 requires the sample duration to be carried in every CMAF Chunk even when a default is set here.

Also Table 11 (General constraints on CMAF header boxes in CMAF switching sets) needs to be updated to change the trex entry from:

trex: Identical

to:

trex: Shall be identical except for default_sample_duration

porcelijn commented 2 years ago

How about something like:

When the xxxx brand is present, default_sample_duration shall be either 0 (indicating unknown or non-constant frame rate), or be equal to the timescale divided by the composition frame rate.

:+1: I like the idea of having at least a hint (i.e. the xxxx brand) of when the mvex+trak info would accurately reflect the frame rate.

But wouldn't it be easier and more natural to interpret frame rate as a sample property, rather than an extension related to fragments?

Usually there are reliable (but codec-specific) ways to glean frame rate from the specific SampleEntryBox. A generic alternative would be to introduce an explicit FrameRateBox, or designate a meta box and require adding it to the sample description in order to comply with profile brand yyyy — similar to existing btrt, colr, pasp, chnl, etc.

For one, solving this at the sample entry potentially benefits all mp4 uses (including progressive), not just CMAF.

cconcolato commented 2 years ago

One more thought. Given that all video specs define their levels in terms of max pixel refresh rate, if a decoder is compliant to a level and you know the (minimum) width and height of the video, you could determine the max (compliant) frame rate. This is similar to deriving the frame rate from the CMAF Media Profile brand. Are we saying that some levels (and CMAF profiles) are too broad and some devices can tolerate a stream at a given level but only up to a frame rate smaller than the max frame rate?
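A rough sketch of that level-based derivation; the maximum luma sample rate must come from the level table of the codec in question (the value below is purely an illustrative placeholder, not a normative number):

```python
def max_frame_rate_for_level(max_luma_sample_rate: int, width: int, height: int) -> float:
    """Upper bound on frame rate implied by a level's luma-sample-rate limit
    for a given picture size. The limit itself comes from the codec spec's
    level table (e.g. MaxLumaSr in HEVC)."""
    return max_luma_sample_rate / (width * height)

# Illustrative numbers only: a limit of 62_668_800 luma samples/s at 1920x1080
print(max_frame_rate_for_level(62_668_800, 1920, 1080))  # ~30.2 fps
```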

jpiesing commented 2 years ago

One more thought. Given that all video specs define their levels in terms of max pixel refresh rate, if a decoder is compliant to a level and you know the (minimum) width and height of the video, you could determine the max (compliant) frame rate. This is similar to deriving the frame rate from the CMAF Media Profile brand. Are we saying that some levels (and CMAF profiles) are too broad and some devices can tolerate a stream at a given level but only up to a frame rate smaller than the max frame rate?

I might be missing your point but... Remember that the video decoder is part of a wider system, and just because a video decoder can do (e.g.) 2K120, it doesn't mean that whatever is between the decoder and the panel can also do 120 Hz. There may be components between the decoder and the panel that can do 4K60 but no higher frame rate.

porcelijn commented 2 years ago

@jpiesing :+1: Good point. I think the CMAF Media Profile brands are an upper bound and codec profile levels are meant to constrain decoder capabilities, whereas frame rate might be limited by display (or even cabling?) capabilities, which are distinct, or by environment context like user preferences or battery level.

Maybe it's best to think of frame rate as an elementary stream property that's orthogonal to codec, resolution, bitrate, etc. Then the introduction of a "FrameRateBox" as part of SampleEntry sounds like the proper solution. In its simplest form it would need only a ticks_per_frame property (expressed in the media timescale) and its presence would imply constant frame rate. To also cater for variable frame rate, a min+max (or avg+max, similar to btrt) pair could be used, and min_ticks_per_frame == max_ticks_per_frame would imply constant frame rate. (Of course the max frame rate would still need to comply with the (CMAF) profile, i.e. be <=.)

Okay, I'm jumping way ahead here — does this make sense?
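Purely as a thought experiment, a sketch of what such a box payload could look like; the 'frat' box type, the version/flags layout and the field widths are all hypothetical, not taken from any spec:

```python
import struct

def build_hypothetical_framerate_box(min_ticks_per_frame: int, max_ticks_per_frame: int) -> bytes:
    """Serialize a hypothetical 'frat' full box carrying min/max ticks per frame
    in the media timescale; min == max would imply constant frame rate."""
    payload = struct.pack(">I", 0)  # version (0) + flags (0), as in a FullBox
    payload += struct.pack(">II", min_ticks_per_frame, max_ticks_per_frame)
    size = 8 + len(payload)         # 4-byte size + 4-byte type + payload
    return struct.pack(">I4s", size, b"frat") + payload

print(build_hypothetical_framerate_box(1001, 1001).hex())
```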

cconcolato commented 2 years ago

@porcelijn @jpiesing Indeed, I realize now that there are cases where a stream cannot be signaled as using a lower level yet does not reach the maximum frame rate for that level, and you want to signal that, in case you have a decoder that could do the max frame rate but a renderer that cannot.

@porcelijn I disagree with the need for a FrameRateBox, for the reasons indicated above.

All, I think the current quasi-consensus is:

Can we agree on that?

dwsinger commented 2 years ago

might be good to allow overriding the defaults, but only to cover 'timing glitches' and say that there shall not be a succession of N frames that exceed the frame rate stated by timescale/default_sample_duration. If N is 2, that's very tight.

cconcolato commented 2 years ago

@dwsinger I think it's generally covered in the "should not change in fragments". I'm not sure we need to be specific about cases where this "should not" can happen. It seems application specific (DASH-IF, ...).

dwsinger commented 2 years ago

ah, that 'should' implies some flexibility. maybe that is a little vague about how/when/how-much one can break the rule?

RufaelDev commented 2 years ago

I think this is also an ISOBMFF issue: if I have a fragmented file that stores a frame rate, I would also like to be able to preserve that in the progressive file. Using trex in a progressive file doesn't make much sense. Second, I also wonder how the case of multiple frame rates would be addressed. Third, as @sdp198 mentioned, especially in the cmf2 case (the original CMAF), sample default values are not really used, since sample property signalling is in trun/tfhd. Last, I am wondering whether brands are really implemented consistently; how well can we rely on brands? My preference would be a box in stsd or the sample entry that can be quietly ignored by parsers that don't understand it, similarly to colr/pasp type boxes.

RufaelDev commented 2 years ago

We prepared a contribution, m60004 https://dms.mpeg.expert/doc_end_user/documents/139_OnLine/wg11/m60004-v1-m60004.zip, on signalling frame rate in ISOBMFF. We took a somewhat different approach from the preliminary consensus here, as that approach does not work for progressive mp4 files. We think it would be good to have a solution for both cases of progressive and fragmented MP4. We present an optional box to be used in stsd or the visual sample entry that can be ignored, similarly to the pasp and colr boxes. This way one could also signal multiple frame rates by inserting multiple framerate boxes, which is also something not supported in the current proposal. We mainly would like to discuss it in the File Format group in the upcoming meeting; we understand that it should first be discussed there, but we would also like the CMAF group to be aware of the proposal, mostly so as to understand the limitations of the current solution discussed on GitHub. For example, multiple frame rates or progressive MP4 files should be supported, and it would also be good not to change the packaging structure of existing content too much. Adding a box in the header would avoid touching fragments/segments etc.; to comply with the current proposal here, by forcing a default sample duration, segments may need to be changed accordingly. We think in our proposal this is much less the case.

cconcolato commented 2 years ago

@RufaelDev Some immediate comments:

and then:

Where possible, the value of the timescale field should be chosen such that when the frame rate is constant, the value of the media sample duration may also be constant.

I am wondering if we should not clarify first that within a CMAF track the frame rate shall be constant. In practice, I've seen devices in the field that do not support changing the frame rate without reinitializing the decoding pipeline.

In my view, we should implement the previous suggestion.

RufaelDev commented 2 years ago

On the moov case: we have seen, for example, files using smooth streaming timescales where samples may shift 1 or 2 ticks; this makes the frame rate (or intended frame rate) calculation not directly derivable. Further, a basic feature is the ability to fragment and de-fragment and keep the information and metadata; in the current CMAF proposal, the frame rate information is lost after de-fragmentation.

Mandating constant frame rate is optimistic and may not always be achieved in practice! I don't think all CMAF tracks use constant frame rates; this is one of the flexibilities the file format provides!

I recommend that the CMAF group seriously consider the flaws in the current CMAF proposal: it would require repackaging and changing existing segments, it has no explicit signalling, and fragmentation/de-fragmentation is not supported since the information is lost. In my opinion this violates some of the core principles of ISOBMFF standardization, especially the support of fragmentation and de-fragmentation.

The proposed box is optional: it can be silently ignored, can be used in profiles, and only affects the Movie Box, not the segments. Again, the proposal is for the file format and not for CMAF at this time, but I think CMAF should not adopt this improvised proposal of using fields intended for timescale and duration to signal the frame rate! Timescale/duration and frame rate are related but not the same thing!

cconcolato commented 2 years ago

On the moov case: we have seen, for example, files using smooth streaming timescales where samples may shift 1 or 2 ticks; this makes the frame rate (or intended frame rate) calculation not directly derivable.

It feels like inventing a new tool to fix misuse of the existing one. I'd rather clarify how the existing one should be used (essentially use a correct timescale). It seems an implementation issue to have to deal with invalid files.

Further, a basic feature is the ability to fragment and de-fragment and keep the information and metadata; in the current CMAF proposal, the frame rate information is lost after de-fragmentation.

The information is not lost. It's stored differently in the fragmented and non fragmented case. In the non-fragmented case, the information is in the stts/ctts.

Mandating constant frame rate is optimistic and may not always be achieved in practice!

I would be interested in concrete examples of frame rate changes within a CMAF track. Is it changing between rates that are multiple of each others (e.g. 25 fps then 50fps then 25) or is it between completely incompatible ones (e.g. 23.976 then 25 then 29.97 ...)?

I don't think all CMAF tracks use constant frame rates; this is one of the flexibilities the file format provides!

CMAF is a profile, it's about restricting the flexibilities of the file format.

I recommend that the CMAF group seriously consider the flaws in the current CMAF proposal:

I still don't see the "flaws". Adding a framerate box, even if optional, is opening the door to tools putting invalid information in the stts/ctts/trun boxes and relying on players using this new box.

it would require repackaging and changing existing segments

It would only require repackaging the initialization segment, which would also be the case with a frame rate box.

porcelijn commented 2 years ago

[...] smooth streaming timescales where samples may shift 1 or 2 ticks.

It feels like inventing a new tool to fix misuse of the existing one. I'd rather clarify how the existing one should be used (essentially use a correct timescale). It seems an implementation issue to have to deal with invalid files.

I guess it's fair to claim that smooth streaming timescales are wrong from a CMAF perspective, but they come up in reality. Conversely, isn't the proposed use of the mvex default sample duration a misuse to fix the omission of frame rate signalling in ISOBMFF?

The information is not lost. It's stored differently in the fragmented and non fragmented case. In the non-fragmented case, the information is in the stts/ctts.

We lose the information when, for example, timescale=100 and framerate=25, so all mvex default sample durations = 4, as it would be in your proposal. But now, additionally, somewhere in the file there's a moof that contains a sample with duration 3, or 5. If we transform that to progressive and then back to CMAF, we technically have a variable sample rate, which might mean variable frame rate OR constant frame rate = 25 fps OR some other constant frame rate; we have to guess here because the mvex info is lost.

I would be interested in concrete examples of frame rate changes within a CMAF track. [...]

Yes, the drop frame use case is a common source of confusion that could be signalled unambiguously once and for all with @RufaelDev's proposal. Another case (related to the above example) is the ad insertion scenario. Consider concatenating two 25 fps video tracks with identical codec configs, where the AAC audio forces a misaligned end time of the very last sample of the first video track. In such cases, it is often unfeasible to transcode all audio or all video or both, and simply adding or removing a tick to the last sample of the first track in the video timeline is quite acceptable.

I still don't see the "flaws". Adding a framerate box, even if optional, is opening the door to tools putting invalid information in the stts/ctts/trun boxes and relying on players using this new box.

IMHO, the flaw lies in the inability to signal framerate, instead relying on inference from sample duration. Fundamentally, there is a subtle difference between sample rate and framerate, and just ignoring those differences and treating every case where those differences persist as "invalid" is not very practical.

dwsinger commented 2 years ago

IMHO, the flaw lies in the inability to signal framerate, instead relying on inference from sample duration.

I think it's rather the opposite: that thinking that there is a constant framerate means an inability to handle situations where (for example):

and so on. If the content adheres to the profile and specifically the level constraints of the codec, a conforming decoder is required to decode it. Any decoder that can only cope with content that has a 'constant frame rate' is not conforming. Why are we trying to give them a pass?

RufaelDev commented 2 years ago

@dwsinger I understand your points; our intent is that we don't want to override or infer any decode/sample timing requirements from the framerate box at all. I think sample timing always takes precedence over the framerate signalling, which is supplementary; the main goal is to store this information in the mp4 file so as to use it later on when generating the MPD or playlist. In fact, I would expect framerate signalling to be independent of the actual sample timing because of these subtle differences that can happen; this is one reason why we opposed the idea of using default sample duration and timescale, in addition to @sdp198's comments on the original requirements of CMAF. So for this proposal, I would like to say that, for example, even a frame held constant for 3 s could be rendered at 25 fps; it is mainly a property intended for the device and for usage in broadcast standards. I would see the framerate more as a property for the intended rendering/decoding rate than as an exact timing construct. I know it sounds a bit vague, but most distribution standards require framerate signalling, and for consistency it would be good to be able to store it in the file format as well. Maybe adding some notes to the text of the framerate box description would be helpful to clarify this.

porcelijn commented 2 years ago

Yes, @RufaelDev! That is exactly what I think the framerate is supposed to signal. For example: if I have a track at 50 fps and one at 25 fps, a player might select the appropriate track based on what it knows about the attached monitor. It should not use it to infer the mp4 sample timeline, which is explicitly signalled already.

But conversely, a player shouldn't need to scan the sample timeline or make an educated guess on what mvex default sample duration might mean if it just needs to consider whether a track is appropriate for a given display.

Another example: suppose I'm trying to insert a third-party ad. I have renditions of the advertisement at 24 fps, 25 fps and 30 fps bundled in a single progressive mp4. The main source has been encoded as CMAF at 25 fps and 50 fps. Extracting and fragmenting a track from the ad is one thing, but how should I select which track to insert? Wouldn't it make sense to have all that info in a universal format, in one place? Why should I have colr and pasp readily available in the SampleEntry, but need to jump through hoops to figure out the best match on the framerate axis?!

Samples and framerate are often strongly related, but they are not the same. The ISOBMFF sample concept describes the sample timeline; now all we need is a way to signal framerate. (Especially now that ISOBMFF codec bindings like AV1 are pushing to omit this information from the codec-specific sample description.)

cconcolato commented 2 years ago

I think that one of the reasons we can't agree on a solution is because we don't have a common understanding of the use cases and problems. Let me try to summarize what I understand:

a. CMAF seems to tolerate variable frame rates but it is not very explicit. It says "when the frame rate is constant".

b. DASH is more explicit as it defines frameRate saying "If the frame or field rate is varying, the value is the average frame rate".

c. HLS defines FRAME-RATE as the maximum frame rate within the video stream, so implicitly supporting variable frame rates.

e. I think we should distinguish between true variable frame rate, i.e. when potentially all frames in a stream have a different duration, vs. when the frame rate changes infrequently: 1) piece-wise constant frame rate (e.g. splicing a 25fps video with a 24fps video) or 2) spiky frame rate transitions (e.g. when splicing two 25fps videos and the last frame of the first video is cut short).

d. Frame rate is not only a feature of the decoder but it is also a feature of the rendering pipeline (e.g. display).

f. It is not clear to me that all CMAF-compatible devices do support variable frame rates, even less true variable frame rate. In fact, CMAF constrains frame rates in Tracks of a Switching Set to be multiples of each other. It would be interesting to get feedback from CTA-WAVE on this.

g. It is not clear to me that devices actually care about the frame rate that is signaled in manifests (HLS, DASH) beyond selecting High Frame Rate / Standard Frame Rate, maybe to initialize the rendering pipeline. It would be interesting to get feedback from player implementers on this. In my experience, browser-based implementations don't care about it, and even if MediaCapabilities includes it in its configuration, it is currently ignored.

h. Then there is the Microsoft Smooth Streaming case. MSS recommends an mdhd.timescale of 10_000_000, which makes it impossible to represent exactly the duration of a frame as an integer for non-integer frame rates (a numeric sketch follows at the end of this comment).

i. I think the problems we are trying to solve are:

  1. given a CMAF track header only, how to generate a DASH/HLS manifest and set the frameRate/FRAME-RATE correctly. It is acceptable to update CMAF to specify changes to the CMAF Track Header to include information but it is not acceptable to change media segments.
  2. given an MSS-compliant CMAF Track, how to round-trip to/from a non-fragmented version while still allowing 1.

Am I missing anything? any constraints?

Assuming I captured everything, my proposal would be a slight modification of https://github.com/MPEGGroup/CMAF/issues/11#issuecomment-1077814289:

One more point: this should be applicable to audio tracks. Today, in fragmented mp4, one has to look deep into the codec-specific configuration to know the frame duration.
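On point (h), a quick numeric illustration of why an MSS-style 10,000,000 timescale cannot carry an exact integer frame duration for a 30000/1001 fps stream, whereas rate-matched timescales can:

```python
from fractions import Fraction

fps = Fraction(30000, 1001)  # 29.97 fps

for timescale in (10_000_000, 30_000, 90_000):
    ticks_per_frame = Fraction(timescale) / fps
    exact = ticks_per_frame.denominator == 1
    print(timescale, ticks_per_frame, "exact" if exact else "not an integer")

# 10000000 -> 1001000/3, not an integer; 30000 -> 1001; 90000 -> 3003
```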

RufaelDev commented 2 years ago

Thanks for the summary, it is appreciated; just a few comments/suggestions:

a) Can we create a separate issue for this? This seems to be separate from the option to signal the frame rate. Demanding constant frame rate for CMAF tracks will have much more impact on existing deployments than introducing an optional box or optional signalling.

b, c) DASH/HLS support is important, yes, but keep in mind that for DASH it is optional. For broadcast-grade streaming to native devices (SCTE/ATSC/DVB), standards often determine the requirements on the decoder's number of pixels and frame rate; this may be the maximum a native decoder can handle. Such native devices may need the number-of-pixels × frame-rate product to estimate whether they can decode that representation.

This is also why trick-play representations use a lower number of pixels, to allow decoding at a higher frame rate on native/capped decoders. I think this is one of the main use cases for frame rate in such standards.

e, d) Our argument is that we want to signal frame rate as metadata for the intended display, or for generating the manifest for native streaming standards (mostly based on DASH), or for fragmenting/de-fragmenting and storing. We don't want this metadata to overrule ISOBMFF timing, which should be leading in all cases. We see this as informative metadata.

f) Again, on the support of variable frame rates: can we please discuss it in a separate issue? I think in this case you also want to consider only video, as other types of content (metadata/timed text, for example) have variable rates almost by definition.

h) Yes, the smooth streaming timescale is one example but there are others; for example, SCTE often uses 90000 or 240000 timescales, including with non-integer frame rates, which may experience a similar issue.

I really think we need to zoom out a bit: we want to be able to signal the frame rate as --optional-- and --informative-- metadata without demanding strict coupling to the ISO file format timelines, which must be leading in all cases.

Given that our proposal to the File Format group is also of interest to the CMAF group, to clarify this maybe we can change the naming to "approximate framerate box" or "informative framerate box" so that it is clear that it imposes no requirements over other normative timing constructs. Other users can then choose to use this optional box if they want. I also think some additional fields may be introduced.

On the point of audio: an ISOBMFF sample would contain multiple audio samples, but the timescale is still often equal to the sample rate to allow accurate start-time signalling of the samples, so I really think frame rate should apply to video/visual.

dwsinger commented 2 years ago

We seem to have two cases: non-compliant playout chains that can't cope with varying timing, and the need to know the 'target' or 'overall' framerate for selection purposes. Are there others?

It seems that for a non-fragmented movie, one can get the number one wants by looking at the movie or track duration and the sample count. That gives an approximation to the frame rate that is close enough for selection purposes.

For fragmented movies, one might have a Movie Extends Header ('mehd') to give you the expected overall duration, but there is currently nowhere to get the expected sample count.

We could surely add a box to the Movie Extends Box ('mvex') with a target_samplerate, which should, over some reasonable interval, be 'close' to the number of samples divided by the elapsed composition time between the beginning of composition of the first sample and the end of composition of the last.
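A sketch of the selection-purpose approximation described above for the non-fragmented case (the sample count and durations would come from the stts/stsz and mdhd boxes; the numbers below are illustrative):

```python
def approximate_frame_rate(sample_count: int, track_duration: int, timescale: int) -> float:
    """Approximate overall frame rate of a non-fragmented track: number of
    samples divided by the track duration in seconds. Good enough for
    selection purposes, not an exact timing construct."""
    if track_duration == 0:
        raise ValueError("track duration unknown")
    return sample_count * timescale / track_duration

# e.g. 1439 samples over 36012 ticks at timescale 600 (~60 s of video)
print(approximate_frame_rate(1439, 36012, 600))  # ~23.98 fps
```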

mohammedsraad commented 2 years ago

How does introducing an optional frame rate box not solve the issues raised here without causing significant disruption to existing deployments?

krasimirkolarov commented 1 year ago

MPEG 140: Discussion ongoing in the File Format group. Keep this until resolved. Participants to comment further here.

RufaelDev commented 1 year ago

At the last meeting there was a discussion in the File Format group, following a liaison from DVB and SCTE. The main idea of supporting frame rate is to support some end-to-end workflows without out-of-band information. In addition, there was a proposal to add (informative) framerate signalling in stsd by introducing a new box, as mentioned in this thread: m60004 https://dms.mpeg.expert/doc_end_user/documents/139_OnLine/wg11/m60004-v1-m60004.zip
In addition, a response was sent that referenced currently available technology, namely:

  1. MPD
  2. a new UUID box
  3. part 15 signalling in the DecoderConfigurationRecord

Proposal m60004 was adopted in the TuC of the File Format group.

It may make sense for the CMAF group to consider the outcome from the File Format group.

haudiobe commented 1 year ago

Proposal: We should review the progress in the File Format group.

RufaelDev commented 1 year ago

@haudiobe that would be helpful, the latest compromise in WD amendment is here: https://dms.mpeg.expert/doc_end_user/documents/142_Antalya/wg11/MDS22616_WG03_N00874.zip

It combines the trex/mdhd approach proposed by @cconcolato with an explicit flag to signal that it is an estimate; review/opinions from CMAF experts would be helpful.

In practice, the 14496-15 HEVC Decoder Configuration Record and/or NAL unit structures based on VPS/SPS are often used to estimate frame rates, but this is not a generic solution.

krasimirkolarov commented 1 year ago

We will keep this open until it is resolved in the File Format group and then follow up here.

cconcolato commented 11 months ago

We welcome contributions on how to update CMAF to reflect the new flag in ISOBMFF.

podborski commented 8 months ago

I think this one was addressed in the File Format group, right @cconcolato?

mohammedsraad commented 8 months ago

It was, yes.


podborski commented 7 months ago

Closing as it was addressed in ISOBMFF.