Restoration2 opened this issue 2 years ago
Do these gamma shifts also happen with a linear VCGT, no novideo_srgb calibration active, the monitor set to 8-bit full range RGB, and the dither bit depth set to 8 bits (i.e. when the dither shouldn't really be doing anything)? If that is the case, try creating the ICC profile with dithering enabled and see whether that gives you a better end result (in theory there shouldn't be a dip then). If not, it should still be possible to compensate for this when generating the lookup table, but I'd need actual pixel data captured from the GPU output to properly investigate how exactly (and maybe also why) the gamma gets modified like that.
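For the lookup-table compensation idea, here is a minimal sketch (my own illustration, not novideo_srgb's actual code, and `measured_output` is a hypothetical function) of how a measured deviation could be pre-compensated when building a 1D LUT, assuming the dithered transfer curve has already been characterized from captured GPU output as a monotonic function of the input signal:

```python
import numpy as np

# Hypothetical sketch, not novideo_srgb's actual implementation.
LUT_SIZE = 1024
TARGET_GAMMA = 2.2

def build_compensated_lut(measured_output):
    """Build a 1D LUT that cancels a measured output deviation.

    measured_output: callable mapping a normalized input signal (0..1) to the
    normalized light output (0..1) actually produced with dithering enabled,
    characterized beforehand from captured GPU output. Must be monotonic.
    """
    signal = np.linspace(0.0, 1.0, LUT_SIZE)
    target = signal ** TARGET_GAMMA  # light output we want per input step
    # Sample the measured transfer curve densely and invert it numerically,
    # so lut[i] becomes the signal that actually produces the target output.
    probe = np.linspace(0.0, 1.0, 4096)
    measured = np.array([measured_output(p) for p in probe])
    return np.interp(target, measured, probe)

if __name__ == "__main__":
    # Toy deviation: output slightly too bright around 90% input.
    def toy_measured(v):
        return v ** 2.2 + 0.012 * np.exp(-((v - 0.9) ** 2) / 0.002)

    lut = build_compensated_lut(toy_measured)
    print(lut[int(0.9 * (LUT_SIZE - 1))])  # slightly below 0.9, cancelling the bump
```

The point of the sketch is only that the LUT would be built by numerically inverting the measured curve, rather than assuming the GPU output matches the values written to the VCGT.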
Hi, I've noticed that when dithering is enabled there is a dip, and the gamma is lower than expected, at exactly the 90% mark on a DisplayCAL verification report. People have mentioned this before in connection with the NVIDIA dithering hack on their forums for calibration LUTs, and I've seen it on other verification reports too. (The dithering method doesn't matter; the results are the same whether temporal or otherwise.)
2.2 relative with 100% black output offset, dithering on:
2.2 relative with 100% black output offset, dithering off:
BT.1886, dithering on:
BT.1886, dithering off:
Literally everything else is better with dithering on (gamma tracking apart from 90%, and gray balance), but no matter what gamma I try, it always has that dip at 90% when dithering is on, even if the rest is perfect. I've tried other gamma values with dithering disabled, and even though the accuracy is worse (especially at 95% and above), it always hits 90% correctly, whereas with dithering enabled it dips again at 90%. It seems like a glitch with NVIDIA dithering, and I feel like it affects perceived contrast in the lighter areas even though the effect is very slight.
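For reference on what a dip at 90% means numerically, here is a small illustration (my own numbers, not taken from the reports above) of how a per-patch effective gamma relates to the measured luminance: a patch that comes out slightly brighter than the target curve at 90% input reads as a lower gamma value.

```python
import math

def effective_gamma(signal, Y, Y_black, Y_white):
    """Per-patch effective gamma from measured luminance (cd/m^2), using a
    common black-subtracted (relative) normalization."""
    y_norm = (Y - Y_black) / (Y_white - Y_black)
    return math.log(y_norm) / math.log(signal)

# A 2.2 target at 90% input puts the patch at 0.9**2.2 ~= 0.793 of white.
# If the dithered output measures slightly brighter, e.g. 0.805 of white:
print(effective_gamma(0.9, 0.805 * 120.0, 0.0, 120.0))  # ~2.06, a dip below 2.2
```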