I find this formula for calculating the luminance of scRGB (R16G16B16A16_FLOAT HDR) colors a bit confusing. In DirectX 11 and 12, scRGB swapchain buffers map an RGB value of (1, 1, 1) to a luminance of 80 nits and (0, 0, 0) to 0 nits; the display's min and peak brightness don't come into play (except within the game's own tonemapping). Is this code perhaps referring to the FreeSync version of scRGB HDR?
Code: https://github.com/GPUOpen-LibrariesAndSDKs/FidelityFX-SDK/blob/a0632abf1350bb64c098573d84c42f053f053a6e/sdk/include/FidelityFX/gpu/frameinterpolation/ffx_frameinterpolation_common.h#L45C14-L45C30
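To make the 80-nit mapping concrete, here's a small sketch (my own illustration, not code from the SDK): in scRGB, pixel values map linearly to nits with 1.0 = 80 nits (the sRGB reference white), using the Rec.709 luminance weights, regardless of the attached display's capabilities.

```python
# Illustrative sketch of scRGB luminance (not the SDK's code).
REC709_LUMA = (0.2126, 0.7152, 0.0722)  # BT.709 luminance weights
SDR_WHITE_NITS = 80.0                   # scRGB reference white level

def scrgb_luminance_nits(r, g, b):
    """Luminance in nits of a linear scRGB triple."""
    y = REC709_LUMA[0] * r + REC709_LUMA[1] * g + REC709_LUMA[2] * b
    return y * SDR_WHITE_NITS

# (1, 1, 1) is SDR reference white: ~80 nits.
# (12.5, 12.5, 12.5) is a typical HDR peak-white pixel: ~1000 nits.
print(scrgb_luminance_nits(1.0, 1.0, 1.0))
print(scrgb_luminance_nits(12.5, 12.5, 12.5))
```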
Edit: I think I now understand that this code is meant to normalize R16G16B16A16_FLOAT buffer values into a 0-1 range. In doing so, however, it clips any negative values, which are exactly the ones that encode colors outside sRGB/Rec.709 (i.e. into Rec.2020 and beyond), so FSR Frame Generation seems to strip those HDR colors out of games if it goes through this path.
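A hypothetical sketch of the concern (my own code, only "similar in spirit" to what the linked header does): any divide-and-saturate normalization clamps negative scRGB components to 0, and those negative components are precisely what encode wide-gamut colors.

```python
def normalize_clamped(rgb, scale):
    # Hypothetical normalization: divide into 0-1 and saturate.
    # Negative scRGB components (out-of-Rec.709 colors) are clipped
    # to 0 and permanently lost.
    return tuple(min(max(c / scale, 0.0), 1.0) for c in rgb)

# A saturated wide-gamut green in scRGB has negative R/B components:
wide_green = (-0.2, 1.4, -0.05)
print(normalize_clamped(wide_green, 2.0))  # → (0.0, 0.7, 0.0)
```

After clamping, the result is indistinguishable from an in-gamut Rec.709 green, so reversing the normalization later cannot recover the original chromaticity.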