NVIDIAGameWorks / NRI

Low-level abstract render interface
MIT License

feat: add logic to sort swapchain formats #103

Closed · pollend closed this 1 month ago

pollend commented 1 month ago

This logic attempts to sort the available surface formats and pick the best one for a given `SwapChainFormat` criterion. I place the colorspace in the highest bits so candidates are sorted by colorspace first, then by format; formats have to be UNORM with a bit width that matches the criteria. The code feels a bit overbuilt, though.

Does it matter whether the driver reports BGRA vs RGBA? Should the selection be more consistent about it? I don't think it should matter.

https://github.com/NVIDIAGameWorks/NRI/issues/102
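
For illustration, here is a minimal sketch of the selection idea described above (not the actual PR code; the helper names and bit weights are hypothetical): the colorspace match lands in the highest bit of the score, so candidates are ordered by colorspace first and by format suitability second.

```cpp
#include <cstdint>
#include <vector>
#include <vulkan/vulkan.h>

// Hypothetical scoring: the colorspace match occupies the highest bit of the
// score, so candidates are ordered by colorspace first, then by whether the
// format is a UNORM with the requested bit width (channel order is ignored).
static uint32_t ScoreSurfaceFormat(const VkSurfaceFormatKHR& candidate,
                                   VkColorSpaceKHR wantedColorSpace,
                                   uint32_t wantedBitsPerChannel) {
    uint32_t score = 0;

    if (candidate.colorSpace == wantedColorSpace)
        score |= 1u << 8; // colorspace dominates the ordering

    const bool is8BitUnorm = candidate.format == VK_FORMAT_B8G8R8A8_UNORM ||
                             candidate.format == VK_FORMAT_R8G8B8A8_UNORM;
    if (wantedBitsPerChannel == 8 && is8BitUnorm)
        score |= 1u; // format matches the requested bit width

    return score;
}

// Pick the highest scoring format from what the surface reports
// (assumes at least one candidate).
static VkSurfaceFormatKHR PickSurfaceFormat(const std::vector<VkSurfaceFormatKHR>& candidates,
                                            VkColorSpaceKHR wantedColorSpace,
                                            uint32_t wantedBitsPerChannel) {
    VkSurfaceFormatKHR best = candidates.front();
    uint32_t bestScore = 0;
    for (const VkSurfaceFormatKHR& c : candidates) {
        const uint32_t score = ScoreSurfaceFormat(c, wantedColorSpace, wantedBitsPerChannel);
        if (score > bestScore) {
            bestScore = score;
            best = c;
        }
    }
    return best;
}
```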

dzhdanNV commented 1 month ago

Thanks for improving this. I have a few questions:

  1. Sometimes you use:

    ((!formatProps.isSrgb) << 4) | ((surface.colorSpace == X) << 4);

    Sometimes:

    ((!formatProps.isSrgb) << 4) | ((surface.colorSpace == X) << 5);

    Why?

  2. case nri::SwapChainFormat::BT709_G10_16BIT: uses priority_BT709_G22_16BIT. Why?

  3. Should `formatProps.isSrgb` be used for 10 bits? 10-bit RGB doesn't have an sRGB evil-twin format (but the swap chain does support a gamma 2.2 transfer function).

P.S. Probably I understand why, but would like to hear your explanation.

pollend commented 1 month ago

> Thanks for improving this. I have a few questions:
>
> 1. Sometimes you use:
>    `((!formatProps.isSrgb) << 4) | ((surface.colorSpace == X) << 4);`
>    Sometimes:
>    `((!formatProps.isSrgb) << 4) | ((surface.colorSpace == X) << 5);`
>    Why?
> 2. `case nri::SwapChainFormat::BT709_G10_16BIT:` uses `priority_BT709_G22_16BIT`. Why?
> 3. Should `formatProps.isSrgb` be used for 10 bits? 10-bit RGB doesn't have an sRGB evil-twin format (but the swap chain does support a gamma 2.2 transfer function).
>
> P.S. Probably I understand why, but would like to hear your explanation.

Probably an oversight on my end. I would drop the sRGB check for 10 bits, and for the other formats that don't have an sRGB variant either; it looks like only the 8-bit formats have one. I'm not familiar with what formats are available or what to expect from the surface. You could have a set of flag bits plus a number, so it sorts by the flags first and then by the number; you can pack it all into one uint32, as sketched below.
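
A minimal sketch of the "pack it all into one uint32" idea (the bit layout is hypothetical, not what the PR ends up using): boolean criteria go in the high bits so they dominate the ordering, and a small ordinal in the low bits breaks ties.

```cpp
#include <cstdint>

// Hypothetical key layout: flags in the high bits, ordinal in the low byte.
// Comparing keys numerically sorts by the flags first, then by the ordinal.
static uint32_t PackPriority(bool colorSpaceMatches, bool isUnorm, uint8_t ordinal) {
    return (uint32_t(colorSpaceMatches) << 9) |
           (uint32_t(isUnorm) << 8) |
           uint32_t(ordinal);
}
```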

Would this have to be done for DirectX as well?

dzhdanNV commented 1 month ago

Sure. This logic seems GAPI irrelevant, so better to share it between D3D11/D3D12/VK.

Could you please formulate your problem? I understand that my simple checks "are not enough", but I didn't get which set of surface formats you have, and which format from that list is desired but not being selected.

dzhdanNV commented 1 month ago

> This logic seems GAPI irrelevant, so better to share it between D3D11/D3D12/VK.

Ooops. My bad. Not needed for D3D, because there is always full support for those.

dzhdanNV commented 1 month ago

Summary:

pollend commented 1 month ago

> Sure. This logic seems GAPI irrelevant, so better to share it between D3D11/D3D12/VK.
>
> Could you please formulate your problem? I understand that my simple checks "are not enough", but I didn't get which set of surface formats you have, and which format from that list is desired but not being selected.

```cpp
// Color space:
//  - BT.709 - LDR https://en.wikipedia.org/wiki/Rec._709
//  - BT.2020 - HDR https://en.wikipedia.org/wiki/Rec._2020
// Transfer function:
//  - G10 - linear (gamma 1.0)
//  - G22 - sRGB (gamma ~2.2)
//  - G2084 - SMPTE ST.2084 (Perceptual Quantization)
// Bits per channel:
//  - 8, 10, 16 (float)
```
These are the criteria I'm thinking of, given the enum. My thought would be to order by colorspace, then by format. For the format, I guess you would order by closest match. Would the order of the channels matter? I wouldn't assume so, since that's independent of the shading code, right? Outputting to BGRA is the same as RGBA.

I haven't worked with these other colorspace formats, so I'm not familiar with the expectations. You can always add more bits if you want to prioritize one colorspace over another when one is a closer match. Reserve a part of the value for the colorspace, maybe 4 bits, so you write 1 for one colorspace and 2 for another; colorspace 1 would then sort before colorspace 2.

I'm still working on this project, so my machine is the only sample at the moment. The code here expected BGRA8, but mesa/AMD reports RGBA8. Sorry if I can't give you a really straight answer.
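
As an illustration of the BGRA8 vs RGBA8 point (a hedged sketch, since which of the two a driver exposes varies): the selector can simply treat both channel orders as acceptable 8-bit UNORM candidates.

```cpp
#include <vulkan/vulkan.h>

// Either 8-bit UNORM channel order is fine for presentation; shader output
// doesn't depend on how the swap chain stores the channels.
static bool IsAny8BitUnorm(VkFormat format) {
    return format == VK_FORMAT_B8G8R8A8_UNORM ||
           format == VK_FORMAT_R8G8B8A8_UNORM;
}
```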

dzhdanNV commented 1 month ago

> outputting to bgra is the same as rgba

Right. Unless you want to access the texture in shader code. Not a big problem, since NRI returns the swap chain format (in one way or another), so the app is aware.

Agree with the rest.

Do you plan to add more changes?

pollend commented 1 month ago

> > outputting to bgra is the same as rgba
>
> Right. Unless you want to access the texture in shader code. Not a big problem, since NRI returns the swap chain format (in one way or another), so the app is aware.
>
> Agree with the rest.
>
> Do you plan to add more changes?

I'll quickly make the changes, unless it's easier on your end to just rework what I have.

pollend commented 1 month ago

> > outputting to bgra is the same as rgba
>
> Right. Unless you want to access the texture in shader code. Not a big problem, since NRI returns the swap chain format (in one way or another), so the app is aware.
>
> Agree with the rest.
>
> Do you plan to add more changes?

The original code was a bit overbuilt. I think these criteria should be a good start; if they need to be adjusted, that can always happen later. For the sRGB scenario you don't need to handle it, because if the surface reports an sRGB format there must be an associated UNORM one.
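
To illustrate that sRGB point (a hypothetical helper that simply assumes the sRGB/UNORM pairing described above): selection can then be done on the UNORM twin only.

```cpp
#include <vulkan/vulkan.h>

// Hypothetical mapping: if a surface reports an sRGB format, assume (per the
// comment above) that the UNORM twin is available too and select on that.
static VkFormat ToUnormTwin(VkFormat format) {
    switch (format) {
        case VK_FORMAT_B8G8R8A8_SRGB: return VK_FORMAT_B8G8R8A8_UNORM;
        case VK_FORMAT_R8G8B8A8_SRGB: return VK_FORMAT_R8G8B8A8_UNORM;
        default:                      return format;
    }
}
```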

You might get an idea looking at this: https://vulkan.gpuinfo.org/listsurfaceformats.php

There are a lot of scenarios to consider... I think it should be safe to start with these until something comes up.

dzhdanNV commented 1 month ago

Thanks Michael. This is a solid starting point.