w3c / web-wcg-hdr-workshop

A proposed W3C virtual workshop on Wide Color Gamut and High Dynamic Range for the Web

HDR Proofing #7

Open svgeesus opened 3 years ago

svgeesus commented 3 years ago

Dmitry Kazakov raises the point in his talk that for SDR, ICC-based proofing is possible, but for HDR it is not possible to proof what content will look like on another display (for example, one with a lower peak luminance), because the tone-mapping step is often not exposed by displays.

Instead, content must be viewed on a range of different physical displays of differing capabilities.
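For the SDR side of that comparison, ICC soft proofing is a single transform in a CMM such as LittleCMS 2; a minimal sketch (the profile file names are placeholders, error handling omitted):

```c
/* Minimal SDR soft-proofing sketch with LittleCMS 2.
 * Profile file names are placeholders; any content/monitor/proof ICC profiles work. */
#include <lcms2.h>

int main(void)
{
    cmsHPROFILE src   = cmsOpenProfileFromFile("content-sRGB.icc",  "r");
    cmsHPROFILE dst   = cmsOpenProfileFromFile("my-monitor.icc",    "r");
    cmsHPROFILE proof = cmsOpenProfileFromFile("other-monitor.icc", "r");

    /* Render as if shown on "other-monitor", then map the result to "my-monitor". */
    cmsHTRANSFORM t = cmsCreateProofingTransform(
        src,   TYPE_RGB_8,
        dst,   TYPE_RGB_8,
        proof,
        INTENT_PERCEPTUAL,              /* rendering intent */
        INTENT_RELATIVE_COLORIMETRIC,   /* proofing intent  */
        cmsFLAGS_SOFTPROOFING);

    unsigned char in[3]  = { 255, 128, 0 };   /* one example pixel */
    unsigned char out[3];
    cmsDoTransform(t, in, out, 1);

    cmsDeleteTransform(t);
    cmsCloseProfile(src);
    cmsCloseProfile(dst);
    cmsCloseProfile(proof);
    return 0;
}
```

The HDR equivalent breaks down precisely because the proofing profile would have to describe the target display's tone mapping, which is the part displays do not expose.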

ppaalanen commented 3 years ago

Yeah, monitors doing arbitrary tone-mapping that we have no control over is a concern for Wayland CM & HDR as well, because we are trying to combine traditional color management and HDR features into a single display system.

There is a theory I first heard from @swick, I think: if you send the monitor an HDR metadata block (Dynamic Range and Mastering InfoFrame, CTA-861-G) with the values corresponding exactly to what the monitor advertises in EDID, and use the "Traditional gamma - HDR luminance range" EOTF setting in it, then just maybe the monitor won't mess things up too badly. Some drivers on Linux seem to allow this, though I'm not sure anyone has actually tried it yet, and we don't have the measuring equipment to verify it either. We've been calling this mode "native HDR", as opposed to "standard HDR" (e.g. BT.2020/PQ), which seems much more susceptible to arbitrary tone mapping in monitors.
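For reference, on Linux that would roughly amount to filling the connector's HDR_OUTPUT_METADATA property with values copied from the monitor's EDID. A sketch of what that could look like with libdrm is below; it is untested, the luminance numbers are placeholders, and it assumes recent kernel uapi headers that define struct hdr_output_metadata.

```c
/* Sketch: drive a monitor in "native HDR" mode via DRM/KMS by sending a
 * Dynamic Range and Mastering InfoFrame whose values mirror the EDID.
 * The luminance values below are placeholders for whatever the monitor
 * actually advertises; as noted above, nobody has verified this on real
 * hardware yet. struct hdr_output_metadata comes from the kernel's
 * drm_mode.h uapi header, pulled in via the libdrm headers. */
#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

#define EOTF_TRADITIONAL_GAMMA_HDR 1   /* CTA-861-G EOTF code */

static int set_native_hdr(int fd, uint32_t connector_id, uint32_t prop_id)
{
    struct hdr_output_metadata meta;
    memset(&meta, 0, sizeof(meta));

    meta.metadata_type = 0;   /* Static Metadata Type 1 */
    meta.hdmi_metadata_type1.eotf = EOTF_TRADITIONAL_GAMMA_HDR;
    meta.hdmi_metadata_type1.metadata_type = 0;

    /* Placeholder values: copy these from the monitor's EDID
     * (HDR static metadata block / colorimetry block). */
    meta.hdmi_metadata_type1.max_display_mastering_luminance = 600; /* cd/m2 */
    meta.hdmi_metadata_type1.min_display_mastering_luminance = 500; /* 0.0001 cd/m2 units */
    meta.hdmi_metadata_type1.max_cll  = 600;
    meta.hdmi_metadata_type1.max_fall = 300;

    uint32_t blob_id;
    if (drmModeCreatePropertyBlob(fd, &meta, sizeof(meta), &blob_id))
        return -1;

    /* prop_id is the id of the connector's "HDR_OUTPUT_METADATA" property,
     * found earlier via drmModeObjectGetProperties(). */
    return drmModeObjectSetProperty(fd, connector_id,
                                    DRM_MODE_OBJECT_CONNECTOR,
                                    prop_id, blob_id);
}
```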

CTA-861-G has other interesting bits like "IT Content Type" too.

Of course, monitors could do their own thing regardless, so end users who really care will need to have measuring equipment to check on the monitor behavior.

Still, our goal with the Weston implementation is to aim for "native HDR" mode first and foremost, and to keep "standard HDR" for monitors and TVs that do not support "native HDR". That way the display server (Weston) is responsible for tone-mapping when possible, and Wayland will also allow applications to do their own tone-mapping if they want, by exposing the monitor characterization to them. If an application claims that its content is already in the same color space and dynamic range as the monitor, then the display server should have no need to mangle the (opaque) pixels any further.
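To make that intent concrete, here is a rough sketch (not Weston's actual code) of the per-pixel decision the compositor would make: pass content through untouched when the monitor already covers its declared range, and only tone-map when it doesn't. The rolloff curve is an arbitrary placeholder.

```c
/* Illustration only, not Weston's actual pipeline: tone-map a linear-light
 * luminance value only when the content's declared peak exceeds the
 * monitor's peak; otherwise pass pixels through untouched. Above a knee,
 * a simple Reinhard-style curve rolls off asymptotically toward the
 * display peak. */
float compositor_map_luminance(float L,            /* cd/m2, linear light   */
                               float content_peak, /* declared by the app   */
                               float display_peak) /* from characterization */
{
    if (content_peak <= display_peak)
        return L;                        /* "native HDR": no mangling */

    /* Keep the bottom 75% of the display range linear, roll off the rest. */
    float knee = 0.75f * display_peak;
    if (L <= knee)
        return L;

    float over = (L - knee) / (content_peak - knee);   /* 0..1 */
    return knee + (display_peak - knee) * over / (1.0f + over);
}
```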

If all that works, then HDR proofing should be possible too, I suppose. Well, between monitors driven in "native HDR" mode.

I guess "standard HDR" mode might not be a completely lost case, if monitors handle the HDR metadata where we could say that our content already matches exactly the monitor gamut and dynamic range. In that case we just lose some precision in color values when we can't use the full pixel value range as it would go out of gamut and/or dynamic range.

Fingers crossed.

dimula73 commented 3 years ago

> There is a theory I first heard from @swick, I think: if you send the monitor an HDR metadata block (Dynamic Range and Mastering InfoFrame, CTA-861-G) with the values corresponding exactly to what the monitor advertises in EDID

Well, I have a feeling that the problem is a bit more convoluted. As far as I understand [1], the DisplayHDR specification allows the display to brighten or dim several areas of the screen to do "fake" HDR. So, basically, when you paint the sun in the center of the screen, the backlight for that area is turned up to 100%, while the rest of the screen is either dimmed (on cheap displays) or left untouched. That is, the granularity of the lightness control is not per-pixel but per-patch, and a patch may be several centimeters across.
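As a toy illustration of that per-patch behavior (all numbers are made up and it does not model any particular display): the backlight of a zone has to follow the brightest pixel in it, so dark pixels sharing a zone with a highlight cannot reach their intended level.

```c
/* Toy model of zone-based backlight control: the panel can only set one
 * backlight level per zone, so dark pixels sharing a zone with a bright
 * highlight cannot go fully dark. All numbers here are made up. */
#include <stdio.h>

#define ZONE_W 4            /* pixels per zone in this toy example */
#define N      8            /* image width                         */

int main(void)
{
    double want[N]  = { 0.05, 0.05, 600, 600, 0.05, 0.05, 0.05, 0.05 };
    double contrast = 3000.0;   /* native panel contrast within one zone */
    double achieved[N];

    for (int z = 0; z < N; z += ZONE_W) {
        /* The backlight follows the brightest pixel in the zone... */
        double backlight = 0.0;
        for (int i = z; i < z + ZONE_W; i++)
            if (want[i] > backlight)
                backlight = want[i];

        /* ...so the darkest achievable level in that zone is raised. */
        double black_level = backlight / contrast;
        for (int i = z; i < z + ZONE_W; i++)
            achieved[i] = want[i] < black_level ? black_level : want[i];
    }

    for (int i = 0; i < N; i++)
        printf("pixel %d: wanted %7.2f cd/m2, got %7.2f cd/m2\n",
               i, want[i], achieved[i]);
    return 0;
}
```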

Though this effect will surely affect proofing, I don't think it makes proofing impossible.

[1] - I have seen this on a real display, but I cannot find it in the specification right now. If someone has a link, please add it :)