dylanraga / win11hdr-srgb-to-gamma2.2-icm

Transform Windows 11's virtual SDR-in-HDR curve from piecewise sRGB to Gamma 2.2

Ideal solution discussion #51


Riekopo commented 4 months ago

Note that it's common for applications to use an sRGB-format swapchain, which leaves the linear-to-sRGB transform to the driver. That driver may do a crude 2.2 ramp, or it may do a full sRGB transform with the small linear segment at the darkest shades. What a game does varies with the driver support at the time it was created: more modern games are likely using the driver's sRGB encode, while older games may just be doing a manual 2.2 gamma in their shaders. A toggle in Windows' Auto HDR feature to tag each executable with the color space you want Windows to interpret it as would be ideal. I don't think changing the default to assume a 2.2 gamma is correct here, especially for more modern titles using Vulkan or DX12.
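To make the driver-encode path concrete, here's a rough D3D11 sketch (untested; as I understand it, a flip-model swapchain stays UNORM and the _SRGB render target view is what turns on the driver's linear-to-sRGB encode on write):

```cpp
// Rough sketch (untested) of the driver-encode path: the flip-model
// swapchain is created with a UNORM format, and the _SRGB render target
// view asks the driver to apply the linear-to-sRGB encode on write.
#include <d3d11.h>

HRESULT CreateSrgbBackbufferRTV(ID3D11Device* device, IDXGISwapChain* swapchain,
                                ID3D11RenderTargetView** outRtv)
{
    ID3D11Texture2D* backbuffer = nullptr;
    HRESULT hr = swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D),
                                      reinterpret_cast<void**>(&backbuffer));
    if (FAILED(hr)) return hr;

    D3D11_RENDER_TARGET_VIEW_DESC desc = {};
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB; // driver sRGB encode on write
    desc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;

    hr = device->CreateRenderTargetView(backbuffer, &desc, outRtv);
    backbuffer->Release();
    return hr;
}
```

The older alternative, a plain UNORM view plus a manual pow(c, 1.0/2.2) at the end of the shader, is exactly how 2.2-encoded data ends up under what Windows assumes is an sRGB surface.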

Devlin1991 commented 3 months ago

I tested Last Epoch in SDR mode (TV set to bt.1886 for a dark room, 2.2 for a bright room) vs HDR mode (TV set to ST2084), and the default Windows SDR-to-HDR ramp more closely matched what I was seeing on the display; applying the sRGB-to-2.2 EOTF correction via the AHK script resulted in very crushed shadows compared to what I see in HDR mode. I have a calibrated sRGB monitor I can ground-truth with tomorrow.

I grabbed a RenderDoc capture of the game and I see that they render to an RGBA16F linear intermediate buffer, then copy the output to the SRGB present surface, making use of the driver sRGB encode. [screenshot] This would explain why applying the 2.2 correction gives the incorrect result on screen: you won't end up back in linear correctly. This is an example of a more modern app (DX11, Unity engine) using the OS/driver-provided sRGB encoding correctly.
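Here's a quick numeric sketch (my own, not from the project) of why that happens: encode with the piecewise sRGB curve, then decode with a 2.2 power, and the shadows come back darker than they started.

```cpp
// Numeric check of the mismatch: the driver writes piecewise-sRGB-encoded
// signal, a 2.2 de-gamma reads it back, and dark values land too low.
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Piecewise sRGB inverse EOTF (linear -> signal), per IEC 61966-2-1.
static double srgb_encode(double cl) {
    return cl <= 0.0031308 ? 12.92 * cl
                           : 1.055 * std::pow(cl, 1.0 / 2.4) - 0.055;
}

int main() {
    for (double cl : {0.001, 0.005, 0.02, 0.1, 0.5}) {
        double cs = srgb_encode(cl);       // what the driver wrote
        double back = std::pow(cs, 2.2);   // what a 2.2 de-gamma reads back
        std::printf("linear %.3f -> signal %.4f -> 2.2 decode %.4f\n",
                    cl, cs, back);
    }
    // e.g. linear 0.005 comes back as roughly 0.002, less than half as
    // bright: exactly the crushed shadows described above.
}
```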

I like that the AHK script can be toggled, by the way; it was very helpful for A/B testing.

I do think we have an underlying issue, the one that macOS tries to solve: Windows does not always know the correct de-gamma function to apply to an app's SDR framebuffer to convert it into the 16-bit linear composition colorspace that Windows uses for HDR/Advanced Color displays. Some apps correctly use the sRGB encoding that Microsoft specifies in their APIs; others decide to ignore it and intentionally write gamma 2.2 encoded data onto the surface, either because they don't know any better, or because they assume that many people's monitors, despite being sRGB-colorspace displays, default to a 2.2 power EOTF instead of the sRGB EOTF out of the box.
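To ground that, here's my mental model of the composition step as a sketch: de-gamma per the window's colorspace tag, then scale into the FP16 scRGB canvas where 1.0 equals 80 nits (the SDR white level below is just an example):

```cpp
// Sketch of how I understand the HDR compositor handles an 8-bit SDR
// window: de-gamma using the colorspace tag, then scale by the user's
// "SDR content brightness" into FP16 scRGB, where 1.0 equals 80 nits.
#include <cmath>
#include <cstdio>

// Piecewise sRGB EOTF (signal -> linear), per IEC 61966-2-1.
static double srgb_eotf(double cs) {
    return cs <= 0.04045 ? cs / 12.92
                         : std::pow((cs + 0.055) / 1.055, 2.4);
}

int main() {
    const double sdrWhiteNits = 200.0;        // the SDR brightness slider
    const double scale = sdrWhiteNits / 80.0; // scRGB: 1.0 == 80 nits

    double signal = 128.0 / 255.0;            // mid-grey pixel from an app
    double linear = srgb_eotf(signal);        // the de-gamma in question
    std::printf("composed scRGB value %.4f (%.1f nits)\n",
                linear * scale, linear * scale * 80.0);
    // If the app actually wrote 2.2-encoded data, the correct de-gamma
    // would have been std::pow(signal, 2.2) instead; that single choice
    // is the whole ambiguity being discussed here.
}
```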

I went digging into the DirectX code from Windows to see what colorspaces they use; they use these tags on the app window to know how to decode it.

see: https://learn.microsoft.com/en-us/windows/win32/api/dxgicommon/ne-dxgicommon-dxgi_color_space_type?source=recommendations

DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709, despite the G22 naming, is very clear in its description that the app should have properly sRGB-encoded data in its window memory:

DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 This is intended to be implemented with sRGB gamma (linear segment + 2.4 power), which is approximately aligned with a gamma 2.2 curve.

This is usually used with 8 or 10 bit color channels.

This is the default colorspace used for apps that either don't specify a colorspace but use an RGBA8 format, or that explicitly use an SRGB-format surface, which enables driver linear-to-sRGB inverse-EOTF encoding on write and sRGB EOTF decoding on read. This is what is set when you use GL_SRGB8_ALPHA8 for an OpenGL app, VK_FORMAT_R8G8B8A8_SRGB for a Vulkan app, or DXGI_FORMAT_R8G8B8A8_UNORM_SRGB for a DirectX app. OpenGL's extension page is explicit about what encoding the driver should do (https://registry.khronos.org/OpenGL/extensions/ARB/ARB_framebuffer_sRGB.txt):

"If FRAMEBUFFER_SRGB is enabled and the value of FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING for the framebuffer attachment corresponding to the destination buffer is SRGB (see section 6.1.3), the R, G, and B values after blending are converted into the non-linear sRGB color space by some approximation of the following:

         {  0.0,                          cl <= 0
         {  12.92 * cl,                   0 < cl < 0.0031308
    cs = {  1.055 * cl^0.41666 - 0.055,   0.0031308 <= cl < 1
         {  1.0,                          cl >= 1
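Transcribed into C++ (my rendering of the spec text above; 0.41666 is the spec's rounding of 1/2.4):

```cpp
// Direct transcription of the ARB_framebuffer_sRGB encode quoted above.
#include <cmath>
#include <cstdio>

static float gl_srgb_encode(float cl) {
    if (cl <= 0.0f)       return 0.0f;
    if (cl <  0.0031308f) return 12.92f * cl;   // the linear toe
    if (cl <  1.0f)       return 1.055f * std::pow(cl, 0.41666f) - 0.055f;
    return 1.0f;
}

int main() {
    std::printf("%.5f %.5f\n", gl_srgb_encode(0.002f), gl_srgb_encode(0.5f));
}
```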

When in SDR mode on Windows with no advanced color mode enabled, you get this: [screenshot]

Which sadly means that apps are meant to query the colorspace of the display and act accordingly (see https://learn.microsoft.com/en-us/windows/win32/wcs/advanced-color-icc-profiles#legacy-windows-color-management-behavior), but the overwhelming majority don't. Your media player is likely to just pass through bt.1886 Rec.709 TV shows as-is (which is fine, since displaying bt.1886 content through the sRGB EOTF gives a good viewing result in brightly lit rooms), your sRGB-encoded JPEG files are output as-is, and the blend of sRGB- and 2.2-encoded video games is also output as-is. It's essentially impossible to have all output from a PC look correct on a display, especially when those displays are often not even set up to the sRGB standard.
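For reference, here's roughly what that query looks like through DXGI (a sketch with error handling trimmed; IDXGIOutput6::GetDesc1 reports the output's active colorspace and luminance caps):

```cpp
// Sketch: ask DXGI what colorspace the first output is currently in.
#include <dxgi1_6.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    IDXGIOutput* output = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK &&
        adapter->EnumOutputs(0, &output) == S_OK) {
        IDXGIOutput6* output6 = nullptr;
        if (SUCCEEDED(output->QueryInterface(&output6))) {
            DXGI_OUTPUT_DESC1 desc = {};
            output6->GetDesc1(&desc);
            // Expect G22_NONE_P709 in SDR mode, G2084_NONE_P2020 with HDR on.
            std::printf("colorspace enum %d, max luminance %.0f nits\n",
                        static_cast<int>(desc.ColorSpace), desc.MaxLuminance);
            output6->Release();
        }
        output->Release();
        adapter->Release();
    }
    factory->Release();
}
```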

- bt.1886 image on an sRGB monitor in a dark room: less contrast, doesn't match intent.
- bt.1886 image on an sRGB monitor in a bright room: a good image, since the raised brightness helps punch through the ambient light to show details that would otherwise be lost in the haze.
- sRGB image on a 2.2 monitor: raises the shadows.
- sRGB image on an sRGB monitor: a good image.
- 2.2 image on a 2.2 monitor: a good image.
- 2.2 image on an sRGB monitor: crushes the shadows.


https://artoriuz.github.io/blog/gamma_correction.html
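Since bt.1886 keeps coming up without being written out, here's its EOTF from the ITU-R BT.1886 spec as a sketch (Lw/Lb are the display's white and black luminance; the values below are just examples):

```cpp
// The bt.1886 EOTF per ITU-R BT.1886 Annex 1; lw/lb are the display's
// white and black luminance in cd/m^2 (example values, not a standard).
#include <cmath>
#include <cstdio>

static double bt1886_eotf(double v, double lw = 100.0, double lb = 0.1) {
    const double g = 2.4, n = 1.0 / g;
    const double a = std::pow(std::pow(lw, n) - std::pow(lb, n), g);
    const double b = std::pow(lb, n) / (std::pow(lw, n) - std::pow(lb, n));
    return a * std::pow(std::fmax(v + b, 0.0), g);  // luminance in cd/m^2
}

int main() {
    // The non-zero black level lifts the shadows relative to a pure 2.4
    // power, which is why bt.1886 content loses contrast on an sRGB display.
    std::printf("V=0.1 -> %.2f nits (pure 2.4 power: %.2f nits)\n",
                bt1886_eotf(0.1), 100.0 * std::pow(0.1, 2.4));
}
```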

I think the tl;dr is that Windows' HDR mode uses the DXGI_COLORSPACE tags to know how to de-gamma each app window back into linear space, but apps with bt.1886, sRGB, and 2.2 data all use that same tag, so it's a compatibility disaster to get everything looking correct.

The dream solution would be a Microsoft-provided per-executable override of the de-gamma function applied to an app's SDR window, so we could tweak everything. A decent solution would be a Microsoft-provided system-global override of the de-gamma for DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 windows, so we could change it based on the app we care about most. A holdover solution that could be done with this project would be to update the AHK script to allow toggling between an sRGB-to-2.2 correction, an sRGB-to-bt.1886 correction, and no correction, with some varied whitepoint nit targeting so it works with different preferred SDR brightness values.
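A sketch of what that toggle could compute, under my assumption that the correction is applied as a per-channel 1D LUT ahead of Windows' sRGB de-gamma (mode names are made up, and the whitepoint targeting is omitted):

```cpp
// Sketch of the proposed toggle as a per-channel 1D LUT: pick the EOTF you
// believe the app really used, then re-encode with the inverse of the sRGB
// decode Windows will apply, so composition lands back on the right linear value.
#include <cmath>
#include <cstdio>

// Inverse of the sRGB decode Windows applies during composition.
static double srgb_encode(double cl) {
    return cl <= 0.0031308 ? 12.92 * cl
                           : 1.055 * std::pow(cl, 1.0 / 2.4) - 0.055;
}

enum class Mode { None, Gamma22, Bt1886 };

// v is the app's 0..1 signal; returns the corrected signal for the LUT.
static double correct(double v, Mode mode) {
    switch (mode) {
        case Mode::Gamma22: return srgb_encode(std::pow(v, 2.2));
        case Mode::Bt1886:  return srgb_encode(std::pow(v, 2.4)); // Lb = 0 case
        default:            return v;
    }
}

int main() {
    // A 256-entry LUT for the sRGB -> 2.2 toggle; print a few taps.
    for (int i = 0; i < 256; i += 64)
        std::printf("%3d -> %.4f\n", i, correct(i / 255.0, Mode::Gamma22));
}
```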