Open swick opened 5 months ago
One of the often overlooked potential issues with PQ based HDR for home viewing is that because the standard is absolute there is no way to increase the display's light output to overcome surrounding room light levels - the peak brightness cannot be increased, and neither can the fixed gamma (EOTF) curve.
Referring to PQ as an 'absolute' standard means that for each input data level there is an absolute output luminance value, which has to be adhered to. There is no allowance for variation, such as changing the gamma curve (EOTF), or increasing the display's light output, as that is already maxed out. (This statement ignores dynamic metadata, more on which later.)
Source: https://lightillusion.com/what_is_hdr.html
Personal opinion: One thing HDR10 tried to achieve is that the image at home ought to match what the engineer saw on his monitor (and then fall off where the TV is inferior). That's why you need to adjust your viewing environment: you can't increase brightness to compensate for daylight viewing conditions. TV manufacturers have since added new settings to mitigate this issue.
I would like to have a more authoritative source on this. The spec itself never talks about absolute luminance, it only assigns the reference display luminances to code points which could be done with any spec such as sRGB.
One thing HDR10 tried to achieve is that the image at home ought to match what the engineer saw on his monitor (and then fall off where the TV is inferior).
That seems to be true for any other spec as well. If you match the viewing environment at home, you should see what the engineer saw on his monitor.
I think this is at best very confusing. For example,
rec2100-linear
has the exact same peak white luminance, black luminance and white luminance but isn't said to have "absolute values". Are the values behaving differently? I don't believe they should.
Yes, they are - and the difference is the PQ transfer function which is defined in terms of absolute luminance values.
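To make "defined in terms of absolute luminance values" concrete, the fixed signal-to-luminance mapping can be sketched directly from the ST 2084 constants (a minimal Python sketch, not any implementation's actual code):

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value [0, 1] to luminance in cd/m²."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

# The mapping is fixed: code value 1.0 is always 10,000 cd/m²,
# no matter what the actual display can reproduce.
print(pq_eotf(1.0))   # 10000.0
print(pq_eotf(0.58))  # ≈ 202 cd/m², close to the 203 cd/m² BT.2408 reference white
```

Note there is no display peak luminance anywhere in the formula; that is the sense in which the curve itself is "absolute".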
Contrast that with 2100 HLG, which is relative (0.75 code value is media white, which can be made brighter or dimmer).
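The relativity of HLG can be sketched as well: the same 0.75 media-white code value lands at different luminances depending on the display's peak, because the BT.2100 OOTF's system gamma is a function of peak luminance (this sketch assumes an achromatic signal and the BT.2390 extended gamma formula):

```python
import math

# BT.2100 HLG inverse OETF constants.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_inverse_oetf(signal: float) -> float:
    """HLG code value [0, 1] -> normalized scene light [0, 1]."""
    if signal <= 0.5:
        return signal ** 2 / 3
    return (math.exp((signal - C) / A) + B) / 12

def hlg_display_luminance(signal: float, peak_nits: float) -> float:
    """Apply the HLG OOTF for an achromatic signal.

    System gamma varies with the display peak: 1.2 at 1000 cd/m²,
    adjusted by 0.42 * log10(peak / 1000) per BT.2390.
    """
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000)
    return peak_nits * hlg_inverse_oetf(signal) ** gamma

# Media white (0.75 code value) is brighter on a brighter display:
print(hlg_display_luminance(0.75, 1000))  # ≈ 203 cd/m²
print(hlg_display_luminance(0.75, 2000))  # ≈ 343 cd/m²
```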
sRGB according to the "Controlling Dynamic Range" section (https://drafts.csswg.org/css-color-hdr/#controlling-dynamic-range) has (reference and peak) white defined at a luminance of 80 cd/m² and black at 0.2 cd/m². Is this not absolute? How is this different than the PQ absoluteness?
That definition is also present in CSS Color 4 (§ 10.2, "The Predefined sRGB Color Space: the [sRGB](https://drafts.csswg.org/css-color-4/#valdef-color-srgb) keyword"). It is relative colorimetry: the reference display does indeed have a media white at 80 cd/m², but industry practice is to turn this up or down to suit personal preference or viewing conditions.
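As an illustration of that practice, here is a sketch in which the 80 cd/m² reference white is only a default scaled by whatever white level the display is set to (the 250 cd/m² figure is just an assumed example, not from any spec):

```python
def srgb_eotf(signal: float) -> float:
    """sRGB code value [0, 1] -> relative (display-normalized) luminance."""
    if signal <= 0.04045:
        return signal / 12.92
    return ((signal + 0.055) / 1.055) ** 2.4

def srgb_display_luminance(signal: float, white_nits: float = 80.0) -> float:
    """Scale by the display's actual white level.

    80 cd/m² is only the IEC 61966-2-1 reference display value;
    users routinely turn it up or down.
    """
    return white_nits * srgb_eotf(signal)

# The same code value yields different absolute luminance once the
# user touches the brightness control:
print(srgb_display_luminance(1.0))         # 80.0 on the reference display
print(srgb_display_luminance(1.0, 250.0))  # 250.0 on a brighter desktop monitor
```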
This has been abused to not properly anchor SDR and HDR content and, at the actual display, have HDR content at a fixed luminance, while SDR content will have different luminances depending on some brightness setting.
It isn't clear how that is abuse.
I would like to have a more authoritative source on this. The spec itself never talks about absolute luminance, it only assigns the reference display luminances to code points which could be done with any spec such as sRGB.
What do you mean "could be done"? How is that relevant?
The PQ format handles display luminance values up to a maximum of 10,000 cd/m² as absolute values. It introduces a new transfer function considering efficient bit allocation on the basis of human visual characteristics that cover a wide luminance range. As an absolute luminance format, video signals have a unique correspondence with luminance values reproduced on the display, which means that the range of video signals that can be displayed depends on the peak luminance of the display. The EOTF of this format was specified in ST 2084 of the Society of Motion Picture and Television Engineers (SMPTE) in 2014 as a reference display standard for use in HDR production.
Source: https://www.nhk.or.jp/strl/english/publica/bt/70/2.html
NHK is basically the Japanese BBC. So it's absolute in the sense that there is a 1:1 mapping of video signal to luminance on the display. If a signal would result in a luminance that's too high for the display to show, the 2020 spec only allows for some kind of falloff, like clipping. One is not supposed to fit the luminance range into what the display can show across the whole spectrum - only at the very top, for out-of-range values.
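A hard clip at the display's peak - one of the simple falloff behaviours described above - can be sketched as follows (purely illustrative; real displays implement their own roll-off):

```python
def display_pq_luminance(target_nits: float, display_peak_nits: float) -> float:
    """Naive hard clip: reproduce the encoded luminance 1:1 where possible,
    and clip everything above the display's peak."""
    return min(target_nits, display_peak_nits)

# A 4000-nit highlight on a 1000-nit display is simply clipped,
# while everything at or below the peak is reproduced as encoded:
print(display_pq_luminance(500.0, 1000.0))   # 500.0
print(display_pq_luminance(4000.0, 1000.0))  # 1000.0
```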
Making it fit somehow is what HLG was made for.
@chrisn
Source: https://www.nhk.or.jp/strl/english/publica/bt/70/2.html
Funnily enough, it says, and I quote: "The video signals of the HLG and PQ formats can be mutually converted. A framework for the conversion is described in Report ITU-R BT.2390."
If that's the case, then the PQ signal which got converted to HLG will be re-rendered by the HLG OOTF on the display to adjust for the viewing environment and whatever luminance was encoded in the PQ signal isn't the one which will be displayed.
PQ is no different than any other display referred signal. They all describe the absolute luminance on the reference display (i.e. the "absolute" luminance is referring to the reference display).
Displays do re-render the images to adjust for the differences between the reference display and reference viewing environment and the actual display and actual viewing environment. Mostly crude methods so far: backlight control, brightness and contrast sliders.
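Those crude methods amount to little more than a linear remap of the incoming (non-linear) signal; a rough, purely illustrative sketch (real controls differ per display):

```python
def apply_brightness_contrast(signal: float, brightness: float, contrast: float) -> float:
    """Crude display-side re-render, as typical monitor controls do it:
    a linear remap of the incoming non-linear signal, clamped to [0, 1].
    The exact behaviour varies per display; this is only illustrative."""
    return min(max(contrast * signal + brightness, 0.0), 1.0)

# Raising brightness lifts everything (and crushes highlights at the top):
print(apply_brightness_contrast(0.5, 0.0, 1.0))  # 0.5 (identity)
print(apply_brightness_contrast(0.9, 0.2, 1.0))  # 1.0 (clamped)
```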
the PQ signal which got converted to HLG…
…is then not a PQ signal anymore. So how does that matter?
PQ is no different than any other display referred signal. They all describe the absolute luminance…
So you agree it's absolute? I'm confused about what point you're trying to make. Again, the spec basically says that for video signal X, the display is supposed to render at luminance Y, does it not?
…on the reference display
Which the viewing display is supposed to follow as closely as possible, right? Exactly because display-referred LUTs match the colours on the reference monitor. If the viewing display didn't follow that, what's the point in having a display-referred LUT/TF in the first place?
Displays do re-render the images to adjust for the differences between the reference display and reference viewing environment and the actual display and actual viewing environment.
But you don't have much control over that as a consumer. The display will switch the LCD backlight to max brightness if an HDR10 signal is detected. You're then at the mercy of the display in terms of tone mapping. But that's irrelevant from the POV of the spec, which defines a 1:1 mapping of input signal to luminance. Don't you agree?
Which the viewing display is supposed to follow as closely as possible, right?
No, this is probably the part that is so problematic. The reference display and viewing environment are in practice only followed when content is mastered but not when it is consumed. The solution has always been to "add a re-rendering step (OOTF) to account for non-reference viewing conditions" and is done by backlight/brightness/contrast control on displays most of the time. The quote is from css-color-hdr (https://drafts.csswg.org/css-color-hdr/#Compositing-SDR-HDR).
This is also exactly how SDR content is handled. sRGB has a reference viewing environment and a reference display which means an sRGB signal is "absolute luminance", but people do constantly re-render sRGB because they view it in varying viewing conditions.
If you call PQ an absolute luminance signal, so is sRGB.
It's not wrong to call them absolute luminance if you understand that this is only the case with a fixed display and viewing environment. If you have a variable viewing environment, neither sRGB nor PQ content should produce a constant luminance because the entire goal of the exercise here is to preserve the appearance.
Even more concerning: if you have an HLG signal and a fixed viewing environment, isn't the HLG signal also producing absolute luminance?
Luminance is always absolute. If you have a fixed viewing environment and fixed display every signal produces absolute luminance.
What differentiates PQ and HLG isn't "absolute luminance", it is that PQ is display-referred and HLG is scene-referred.
But that's irrelevant from the POV of the spec, which defines a 1:1 mapping of input signal to luminance. Don't you agree?
No, see above. Even the spec currently allows for a color re-rendering step.
I think the changes to css-color-hdr I would like to see are
I'm on the same track as @swick here.
For human perception, (absolute) luminance is relative to its viewing environment. As an extreme example of environment dependency, any PQ encoded image displayed on a usual consumer PQ display will be destroyed when the display is brought into bright daylight without any adaptation. One has to choose whether to preserve the image appearance as graded (for some value of "preserve"), or fix the presentation luminance of the signal. These two goals agree only under the reference viewing environment, and disagree otherwise. One could of course say that all viewers must adapt their environments so that displaying an image nit-for-nit becomes appropriate, but I think that does not generalize very well outside of living rooms and theaters.
The above does not consider the physical limitations of a real display. Those limitations may forbid a sufficient adaptation to the environment.
I believe the term "absolute" in this context refers to a unit that is not relative to some maximum at hand (e.g. display peak luminance) but is anchored to some global constant (e.g. reference viewing environment brightness).
Why is the white level of sRGB merely 80 cd/m² while PQ (and HLG on a 1000 cd/m² peak display) has 203 cd/m²? Where does this brightness creep come from?
That was kind of my point. From a user's practical POV, the HDR10 PQ spec is flawed in that way because it defines the viewing environment parameters for an unrealistic setup (too dark). Agreed - but it's not the implementation's job to fix that. It's something you'll have to live with if you want to build a conforming implementation of HDR10. There are already other standards, like HLG and HDR10+, that have taken that problem into consideration.
I guess what I'm saying is that, if I had a hypothetical HDR switch in my display settings and I set it to HDR10 because that's what my display supports, I'd want it to work in a conforming manner, warts and all. I want the burden of creating the right viewing environment to be on me. I don't want to have to second-guess whether the software is trying to do me a favour. I need precise reproduction of colors/brightness if I want to produce an accurate image, not a "nice" one. That's me wearing a content producer's hat, not a consumer's.
Just raising my concerns.
From a user's practical POV, the HDR10 PQ spec is flawed in that way because it defines the viewing environment parameters for an unrealistic setup (too dark).
The reference viewing environment. Just like sRGB defines a reference display with a maximum luminance of 80 nits. It's just that everyone agrees that we should re-render sRGB, but somehow that becomes controversial for PQ.
I keep repeating myself: there is a 1:1 mapping of video signal to luminance with the PQ EOTF. There isn't in sRGB.
Yes, you keep repeating it and missing the crucial point that the luminance is for the reference monitor in the reference viewing environment, so if you change the actual display or the actual viewing environment, the image requires a re-rendering step, just like with any other video signal.
And you keep ignoring that I'm trying to tell you that that's the fundamental difference that makes PQ absolute.
But fear not, I'm not gonna bother you any longer. Cheers.
Are you saying that the 80 cd/m² of the sRGB reference display is not absolute?
We are drifting into the domain of window systems. A browser presents its contents through a window system, so it cannot escape the window system's effects. I would even include the physical brightness and contrast knobs of a monitor or a TV here, since they are essentially indistinguishable from the window system from the browser's point of view.
I consider a display that refuses to adjust contrast and brightness when presented a PQ signal to be broken. Broken hardware is often worked around in software.
@DanMan If you happen to be a content producer working in your studio, do you not calibrate your equipment as well as your environment? As part of that calibration, if you were using a window system that allows you to work around broken displays, would you not also switch off that workaround or take it into account?
What about a home producer who cannot afford or cannot verify a studio-like environment? I would imagine they do the calibration by eye-balling some test images and adjusting their system, including the dynamic range mapping. Would that not lead to better production results for PQ material than trying to mimic the supposed "too dark" look?
Let's consider the alternative, and assume that PQ signal really is displayed nit-for-nit.
Let's say you are a content producer. Your customers complain that the PQ material you provided is too dark, and the only reason for that is that they consume that material in a dim home environment, or even in an office environment. What do you do?
If you modify the material to be brighter, then people in studio environments will complain that it is too bright. People in the office environment might still deem it too dark.
Do you tell those people to make their environment dark enough? "You're holding it wrong." Depending on your audience, would you not lose them?
Dynamic range re-rendering is not easy, even though we might make it sound like it's a ready solution for mismatching viewing conditions. I just think that doing something is better than nothing when the mismatch itself cannot be corrected, and we will find better ways of doing it over time.
What all the above means for the CSS HDR spec, I'm not sure. If wanted, I think the CSS HDR spec can choose to hold on to the promise of PQ signal encoding absolute luminance, and leave all this debate to window system and monitor and TV manufacturers as long as the spec makes no promises on their behalf.
SMPTE ST 2084:2014 says in Introduction (non-normative):
The EOTF does not impart a preferred rendering appearance for any particular viewing environment. Image modifications needed for viewer contrast, colorfulness, highlight details, and visible detail in shadows at any particular output level must be chosen as part of the mastering process.
but then it also says:
The reference EOTF and its inverse represent an efficient encoding system for high luminance range data. Though an idealized display device could follow this EOTF exactly, in real world displays the EOTF can be thought of as a nominal target. Actual displays can vary from the absolute curve due to output limitations and effects of non-ideal viewing environments.
This seems contradictory to me. The latter paragraph seems to allow image re-rendering to the viewing conditions at hand.
Unless... where is "the mastering process"? A PQ signal going from a computer to a monitor makes sense if the mastering process happens in the computer (or TV receiver or video player) rather than before them (in broadcast or video production).
Would anyone happen to have a link to HDR10 specs? I have never found anything better than what we noted in https://gitlab.freedesktop.org/pq/color-and-hdr/-/blob/main/doc/hdr10.md#hdr10-media-profile . What's the HDR10 definition of the reference viewing environment?
Btw. for Wayland the thought has occurred to me that something might want to display nit-for-nit: https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/28
There are already different standards like HLG and HDR10+ that have taken that problem into consideration.
What is the reference viewing environment for HDR10+, anyone have a pointer? The HDR10+ White Paper is super vague on this:
Image choices are driven by looking at a “Mastering Display” in a darkened room which typically has greater capabilities than available to consumers and is in an ideal viewing environment.
One might guess that SMPTE ST 2080-3:2017, "Reference Viewing Environment for Evaluation of HDTV Images", is meant, but I would prefer not to guess.
As I understand it, the whitepaper references BT.2100 as the base of the HDR10/HDR10+ systems, which itself defines the viewing environment:
TABLE 3: Reference viewing environment for critical viewing of HDR programme material
I tried to explain the differences between a theoretical Absolute HDR Mode and an Adaptive HDR Mode, both of which would be useful for the web. #10998
The spec makes the following claim (https://drafts.csswg.org/css-color-hdr/#valdef-color-rec2100-pq):
I think this is at best very confusing. For example,
rec2100-linear
has the exact same peak white luminance, black luminance and white luminance but isn't said to have "absolute values". Are the values behaving differently? I don't believe they should. To take it even further, sRGB according to the "Controlling Dynamic Range" section (https://drafts.csswg.org/css-color-hdr/#controlling-dynamic-range) has (reference and peak) white defined at a luminance of 80 cd/m² and black at 0.2 cd/m². Is this not absolute? How is this different than the PQ absoluteness?
Personally, I don't think that PQ is any more or less absolute than any other full color space + reference display + reference viewing environment combination; it just chose to put the absolute luminance directly in the definition of the curve instead of implicitly via the definition of the reference display.
This has been abused to not properly anchor SDR and HDR content and, at the actual display, have HDR content at a fixed luminance, while SDR content will have different luminances depending on some brightness setting.