ledoge / novideo_srgb

Calibrate monitors to sRGB or other color spaces on NVIDIA GPUs, based on EDID data or ICC profiles
GNU General Public License v3.0

Is the processing in 16bit? #6

Closed: James-F2 closed this 2 years ago

James-F2 commented 3 years ago

I'm seeing banding in the dark region when gamut clamping is enabled. Is the internal GPU processing done at 16 bit before the result is output to the display at 8 bit?

Attached: the ICC profile generated from the EDID, and a 16-bit grey ramp test pattern. 16Bit Gradient.zip Gigabyte M32Q.zip
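
(For reference, a ramp like the attached one can be generated with a few lines of Python, assuming NumPy and Pillow are available; the 4096x256 size and the output filename here are arbitrary choices, not taken from the attached file.)

```python
import numpy as np
from PIL import Image

# Horizontal grey ramp: 0 -> 65535 across the image width, stored as 16-bit.
width, height = 4096, 256
ramp = np.linspace(0, 65535, width, dtype=np.uint16)
img = np.tile(ramp, (height, 1))

# Pillow maps uint16 arrays to its 16-bit greyscale mode ("I;16"),
# which PNG can store without losing precision.
Image.fromarray(img).save("16bit_gradient.png")
```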

Disabled (photo): leftmost shade is 0,0,0 black, pattern zoomed to 500%, camera overexposed, IPS glow visible.

Enabled 1 (photo)

Enabled 2 (photo)

ledoge commented 3 years ago

No idea, sorry. The only public information about this stuff that I could find is here (slides 40 - 42), but it doesn't say anything about the precision.

aufkrawall commented 2 years ago

Internal precision seems to be sufficient, but with NVIDIA and 8 bit output you have to enable dithering manually (it is enabled by default for 10 bit output, and AMD enables it for 8 bit output as well). CalibrationTools lets you do this from its UI by calling NVAPI: https://bitbucket.org/CalibrationTools/calibration-tools/downloads/
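
To illustrate why the dithering step matters (just a toy Python sketch of quantization, not how the driver actually implements it): after the high-precision clamp/LUT stage the values still get rounded to 8 bit on output. Plain rounding of a smooth near-black ramp collapses into flat steps, like the bands in the photos above, while adding sub-LSB noise before rounding keeps the local average on the ramp.

```python
import numpy as np

# A smooth ramp spanning only a few 8-bit code values, as if it came
# out of the GPU's higher-precision LUT/matrix stage.
ramp = np.linspace(16.0, 20.0, 1024)

# Plain rounding to integer code values: long flat runs -> visible bands.
plain = np.round(ramp)

# Dithered rounding: add sub-LSB noise first, so the error turns into
# fine noise instead of steps and the local average still follows the ramp.
rng = np.random.default_rng(0)
dithered = np.round(ramp + rng.uniform(-0.5, 0.5, ramp.shape))

# Compare how well 64-sample local averages track the original ramp.
def block_means(x, n=64):
    return x[: len(x) // n * n].reshape(-1, n).mean(axis=1)

print("max local-average error, plain:   ", np.abs(block_means(plain) - block_means(ramp)).max())
print("max local-average error, dithered:", np.abs(block_means(dithered) - block_means(ramp)).max())
```

The point is just that dithering spends the extra internal precision as fine noise instead of throwing it away in the rounding step, which is why the GPU's output dithering removes the banding even though the final signal is still 8 bit.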

@ledoge While I'm at it: the 1D LUT calibration part is probably out of scope for this project, right? CalibrationTools is a great program, though unfortunately Windows still seems to break its color precision after system suspend etc.