ledoge / novideo_srgb

Calibrate monitors to sRGB or other color spaces on NVIDIA GPUs, based on EDID data or ICC profiles
GNU General Public License v3.0
988 stars 35 forks

Fantastic piece of software! #18

Open James-F2 opened 2 years ago

James-F2 commented 2 years ago

This allowed me to use all the picture settings in my monitor that are typically locked in sRGB Picture Mode. The Dithering option eliminated all banding issues I saw in v1.0, and the custom gamma curve is great.

Even though I have an i1 Display Pro, for simplicity I create a "Synthetic ICC" in DisplayCAL with my EDID xy coordinates and custom gamma curve so I can tweak it further in novideo_srgb, or I can measure my actual xy coordinates with the i1 and then create a Synthetic ICC. Now I have control over my monitor without being limited by the sRGB Picture Mode, which does not allow me to change anything, AND I have an accurate REC.709 gamut in games, browsing, etc.

Thank you again for this amazing software.

PS. An icon for the software would be great, it looks like a virus without one. haha

James-F2 commented 2 years ago

Another thing that would help me update the software is an "About" button with a link to this GitHub page, to remind me that LeDoge created this awesome tool.

ledoge commented 2 years ago

Thank you for the feedback! An "About" window seems like a good idea – I'll add that in the next release. Also, an icon would indeed be nice to have, but I lack the artistic ability to make a decent one...

James-F2 commented 2 years ago

Same icon as DWM LUT, just so it doesn't look suspicious.

James-F2 commented 2 years ago

@ledoge If I do a full greyscale calibration in DisplayCAL, will novideo_srgb take all 256 steps from the ICC for the gamma curve calculation or will it only use the 0,0,0 black point? In other words, will it 'fix' a poorly behaved display that clips several black shades?

ledoge commented 2 years ago

The tone response curves are used in two ways: the black point of the display is calculated using them, and the re-gamma LUT is populated with the inverse of the curves. So I think there shouldn't be any issues?

James-F2 commented 2 years ago

No issue, I just wanted to know if there is any benefit to Calibrating+Profiling, or if just Profiling is enough to get the black value. I'll rephrase: is the calculated gamma curve using ONLY the black point, or is it using all measured greyscale values?

ledoge commented 2 years ago

if there is any benefit to Calibrating+Profiling, or if just Profiling is enough to get the black value

Calibrating just adds vcgt data to the profile (which might result in a more accurate profile?), but novideo_srgb works perfectly fine with or without vcgt data. It's not relevant for the calculation of the black point.

is the calculated gamma curve using ONLY the black point, or is it using all measured greyscale values

There are two stages in the color space conversion pipeline that affect gamma: The first 1D LUT is used to convert the pixel values to linear light. Its values are calculated only from the selected gamma (e.g. sRGB) and the black point. The second 1D LUT is used to convert from linear light to the right values for the display - this is where all of the measured grayscale values and, if present, the vcgt data come in.
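For anyone curious, here's a rough sketch in Python of how those two 1D LUTs relate (the actual implementation is C#; the black point, curve data and LUT size here are made up for illustration):

```python
import numpy as np

LUT_SIZE = 1024

def srgb_eotf(v):
    # sRGB encoded value -> linear light
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

x = np.linspace(0, 1, LUT_SIZE)

# Stage 1: de-gamma LUT, built only from the selected gamma and the black point
# (simplified; the real code handles the black point per the selected options)
black = 0.0005  # relative black point in linear light, made up
degamma = black + (1 - black) * srgb_eotf(x)

# Stage 2: re-gamma LUT, the inverse of the display's measured tone response;
# this is where the profile's grayscale measurements (and vcgt) come in
measured_trc = x ** 2.35                   # stand-in for the measured curve
regamma = np.interp(x, measured_trc, x)    # numeric inverse via interpolation
```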

James-F2 commented 2 years ago

I see, so there is a benefit in Profiling with a large amount of measured greyscale patches. Thank you!

James-F2 commented 2 years ago

One last thing for people who are using DisplayCAL and a calibration device. If you are using DisplayCAL to create an ICC, make sure you select "XYZ LUT+Matrix" in the Profiling tab, because Matrix-only profiles will not produce a correct gamma curve in novideo_srgb. I use a custom testchart with 256 grey (neutral) patches for maximum greyscale accuracy, and at least 400 color patches. Also, there is no need for Calibration, so leave everything there at "As Measured" like you would for a MadVR calibration, except for the interactive white point adjustment, so you can bring your display closer to 6500K manually with the R,G,B controls in your monitor's OSD before profiling.

I've attached a REC.709 calibration verification with the ICC applied in novideo_srgb; my monitor is in its Standard wide gamut mode. Mind-blowing results that rival MadVR with its advanced 16-bit processing and dithering, and it runs natively on the NVIDIA GPU. novideo_srgb deserves a lot more recognition.

Measurement Report 3.8.9.3 — Gigabyte M32Q @ 0, 0, 2560x1440 — 2022-03-08 13-19.html.zip

ledoge commented 2 years ago

Wow, those results are awesome! Thanks for sharing.

Matrix-only profiles will not produce a correct gamma curve in novideo_srgb

They should work fine as long as you disable black point compensation.

James-F2 commented 2 years ago

Yeah, amazing that this level of accuracy can be applied natively at the GPU level (including advanced dithering), without the need for external hardware, expensive grading monitors, or specialized software.

It is my greatest pleasure using your software and software by people like Graeme Gill and Florian Höch. I am very grateful for this.

James

EDIT: I shamelessly forgot Madshi. :facepalm:

aufkrawall commented 2 years ago

@James-F2 Would you mind sharing your testchart for DisplayCal? :)

James-F2 commented 2 years ago

Not at all.

It's nothing special: 256 neutral patches, and 7 for the single and multidimensional color patches. I think 7 is way too many and 5 is more than enough, but I always do 256 neutral patches so there is minimal interpolation, especially in the dark region.

256greyscale testchart.zip

testchart

aufkrawall commented 2 years ago

Thanks, I'll give it a try and report back!

markanini commented 2 years ago

Curious if custom testcharts improve verification scores in this application. Before using novideo_srgb on my consumer-grade WLED display I got the best scores with the default 175 patch set; somewhere on the DisplayCAL forums Florian mentioned that testchart had some special optimization, but I don't remember the details.

I create a "Synthetic ICC" in DisplayCAL with my EDID xy coordinates

How do you gather those from an existing profile?

ledoge commented 2 years ago

Curious if custom testcharts improve verification scores in this application.

I don't really know much about how ArgyllCMS works internally, but what I'd expect is that if you're generating an XYZ LUT profile, adding more color patches won't help, but adding grayscale ones will (since novideo_srgb samples the extreme points and the grayscale axis only). With a 3 curves + matrix profile (make sure to disable black point compensation!) I think adding more grayscale and color patches should improve the average delta E.
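As a hypothetical Python sketch of what that sampling looks like (a hard-coded matrix stands in for the profile's device-to-XYZ transform here):

```python
import numpy as np

# Stand-in for the ICC profile's device -> XYZ transform (sRGB primaries,
# gamma 2.2); a real implementation would evaluate the profile's LUT/matrix.
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])

def device_to_xyz(rgb):
    return M @ (np.asarray(rgb, dtype=float) ** 2.2)

# Extreme points: define the gamut, used for the 3x3 conversion matrix
extremes = [device_to_xyz(c) for c in ([1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1])]

# Grayscale axis: defines the tone response, used for the re-gamma LUT
grays = [device_to_xyz([v, v, v]) for v in np.linspace(0, 1, 256)]
```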

James-F2 commented 2 years ago

since novideo_srgb samples the extreme points and the grayscale axis only

Ah, thank you, that clarified a lot. So there is no need for color patches at all, only greyscale and three 100% RGB values, I see.

I can confirm that when I added more neutral/grey patches to the testchart it completely fixed all the black crush I saw previously with the default number of grey patches.

Thanks!

aufkrawall commented 2 years ago

Yes, interesting! I get lower gamma curve deviations when using the 1x curve + matrix format with calibration speed set to medium vs. your custom chart, so I'll be sticking to that. 3x curves seems to give me worse color and gamma deviation, at least when using novideo_srgb.

James-F2 commented 2 years ago

I skip calibration and profile only; it provides enough data and is much faster, since there is no iterative calibration to minimize dE during measurement.

I just did a Curves+Matrix profiling pass with 256 neutral patches and 10 RGB patches each, and the results are spectacular; they might be even better than my previous XYZ LUT attempt.

Testchart and Verification.zip

aufkrawall commented 2 years ago

I skip calibration and profile only; it provides enough data and is much faster, since there is no iterative calibration to minimize dE during measurement.

Yeah, maybe my Spyder 4 is too shitty and is more dependent on those extensive calibration tests. Just a wild guess though. The novideo_srgb clamp result with the default 1x curve chart + medium speed calibration probably still wipes the floor with almost any monitor's internal sRGB clamp: Measurement Report 3.8.9.3 — G27q-20 @ 0, 0, 2560x1440 — 2022-03-08 22-57.zip

Of course, a driver/OS API for full-blown 3D LUT correction would be the ne plus ultra. I guess you'd find higher deviations with a simple gamut clamp if you ran a really extensive validation test chart, but that's probably hard or impossible to notice during everyday usage, depending on the display's linearity.

James-F2 commented 2 years ago

I have to say that the ease of jumping between calibrated and accurate gamma curves without creating several different 3D LUTs for MadVR is absolutely delightful. I use sRGB, BT.1886 and Relative 2.35+0% Offset curves, and it has never been this easy with this level of accuracy before.

My collection of calibration test images confirms that everything is working perfectly and the resulting gamma curves are accurate, smooth (with dithering), and free of black crush: Calibration Pictures.zip. For me, the key was 256 neutral patches (all 8-bit steps) with Curves+Matrix profiling.

I'm still blown away by how quick, accurate, easy, and bug-free this is; I can't be bothered with 3D LUTs anymore.

erusyns commented 2 years ago

hey james, do you mind making a step-by-step guide on how you did this? I'm having a hard time understanding the process. I have calibrated before using XYZ LUT + Matrix.

James-F2 commented 2 years ago

Here's a summary:

1. Skip calibration by setting everything to 'as measured'; only do 'interactive display adjustment' for a 6500K white point.
2. Profile with Curves+Matrix (black point compensation disabled), using 256 neutral patches and 2 single-channel (color) patches, or use this test chart: 256 Neutral Testchart.zip

For verification I use these settings, with novideo_srgb clamped to sRGB:

verification

That's it, this is exactly what I did.

James-F2 commented 2 years ago

I am not sure how novideo_srgb uses the color patches from a Matrix profile though, or whether there is any benefit to using more than just the three 100% primary RGB patches.

erusyns commented 2 years ago

thank you james

ledoge commented 2 years ago

I am not sure how novideo_srgb uses the color patches from a Matrix profile though, or whether there is any benefit to using more than just the three 100% primary RGB patches.

AFAICT ArgyllCMS will use the color patches to generate an optimized matrix, so there might be a slight benefit for average delta Es at least.

Also, I made some experimental changes that should (assuming I did it properly) result in slightly better 8-bit gamma tracking. Please try this build and see if you get lower grayscale errors compared to the current version. The differences might be very minor though, so I'd suggest comparing it to a regular DisplayCAL verification without "Use simulation profile as display profile" as well to make sure the ICC profile is still accurate (or make a new one).

James-F2 commented 2 years ago

I profiled a new ICC with 256 neutral and 3x25 color patches. Leaving the i1 Display Pro on the screen after profiling, I see no difference in verification between the release and test builds; I made sure to refresh the "Clamped" button between .exe swaps.

markanini commented 2 years ago

Depending on the profile, the average Delta E dropped by 0.05-0.1 with the test version. I made the profiles last night.

Independently, I compared the average Delta E for the different profile types in DisplayCAL: XYZ LUT+matrix, Curves+matrix, Single curve+matrix, Gamma+matrix, Single gamma+matrix. Going down the list the average Delta E increases, but the first two are very close. So far the tests used Tone curve: As measured.

Setting Tone curve to Gamma 2.2 and going from Calibration speed: Very fast to Slow, the average Delta E increases! So my chosen profile uses Tone curve: As measured and Profile type: XYZ LUT+matrix for the lowest average Delta E. My measurement device is an i1Display Pro.

I ran out of time to make a comparison where DisplayCAL loads the profile instead, or one using James-F2's test chart, so one of you guys can help answer that one.

EDIT: Black point compensation was unchecked at all times.

James-F2 commented 2 years ago

I see an improvement in the smoothness of the greyscale with v2.7. There was a small 'kink' in the "16Bit Gradient.png" test pattern (I shared it above) with 2.6 and the test build (dithering enabled); now it's gone with 2.7 and the greyscale is smooth as eggs.

ledoge commented 2 years ago

Very interesting – I thought that test build should have perfectly accurate grayscale, and with v2.7 I removed part of that to get better color accuracy with slightly worse grayscale accuracy, so I'm not sure why it would be even better now, but it's very nice that it is!

James-F2 commented 2 years ago

Maybe it was a rounding error before 2.7? When you say "Optimize de-gamma LUT for 8 bit values", what was it optimized for before that? My monitor is capable of 10-bit gradation but I always work in 8-bit because it is the most compatible with software, particularly when switching between OpenGL and Direct3D; 8-bit is still the standard.

Here are some images of a grey ramp with the Test Build and v2.7. Notice that with the Test Build the kink doesn't smooth out even with 8-bit dithering on. v2.7 with dithering on is perfectly smooth, same as without clamping.

Test Build, Dithering OFF

Test Build, Dithering ON: several faint bright kinks are still visible.

v2.7, Dithering OFF

v2.7, Dithering ON: perfectly smooth in person, no repetition of 8-bit grey lines, and the new gamma tracking is spot on.

ledoge commented 2 years ago

When you say "Optimize de-gamma LUT for 8 bit values", what was it optimized for before that?

Before that, I just filled the de-gamma/re-gamma LUTs "normally", which meant that 10 bit values would be mapped exactly, but 8 bit values would not, as they end up being interpolated between two entries in the de-gamma LUT. The optimization I did is identifying the entries that would be used for interpolation and putting the correct values for 8 bit inputs in them. Since there are pairs of entries with identical values now, the bit depth is effectively reduced to 9 bits, so there's probably a bit of banding with 10 bit inputs.
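In Python terms, the trick looks roughly like this (made-up LUT contents, not the real code):

```python
import numpy as np

lut = np.linspace(0, 1, 1024) ** 2.2  # stand-in de-gamma LUT with 10-bit entries

for v8 in range(256):
    target = (v8 / 255) ** 2.2  # exact output this 8-bit input should produce
    pos = v8 * 1023 / 255       # where the input lands in the LUT (usually non-integer)
    lo, hi = int(np.floor(pos)), int(np.ceil(pos))
    # Interpolating between two identical entries always returns that value,
    # so writing the target into both bracketing entries makes 8-bit inputs
    # exact, at the cost of effectively ~9 bits of LUT resolution.
    lut[lo] = lut[hi] = target
```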

James-F2 commented 2 years ago

For the record, when I switch to 10bit in nvidia control panel and use 10bit dithering, the bright kink is still present in v2.6 and the test build.

Shouldn't 8bit values be aligned in 10bit by default without interpolation?

ledoge commented 2 years ago

For the record, when I switch to 10bit in nvidia control panel and use 10bit dithering, the bright kink is still present in v2.6 and the test build.

From my understanding, since the desktop is still being rendered in 8 bits, the values going into (and coming out of) the color space conversion pipeline are the same, so this is expected. The only difference is that the values get converted to 10 instead of 8 bits before being sent to the display.

Shouldn't 8bit values be aligned in 10bit by default without interpolation?

That does seem like it should work, but unfortunately it doesn't. 1023/255 is 4.01176.... You're only getting nicely aligned conversions if you go all the way up to 16 bits, as 65535/255 = 257 (IIRC the Windows VCGT loader produces slightly wrong results because it assumes that this number is 256).

James-F2 commented 2 years ago

Interesting, I thought it should be 1024/256, zero included. When novideo_srgb is disabled I don't see any change in greyscale whether my display is set to 8 or 10 bit; the input is of course always 8-bit from Windows. Maybe the NVIDIA driver is doing some kind of conversion from 8-bit input to 10-bit with dithering when the display is set to 10 bit?

8-bit is still the standard for most if not all non-color-aware applications, and rounding to 8-bit was a great decision. If a full 10-bit pipeline is still desirable for the rare applications that output real 0-1023 values (like madVR in exclusive mode, or Photoshop) and bypass DWM, maybe you can make it switchable?

In any case, I believe 8-bit should be the default and 10-bit processing can be a switchable option if needed. Warning though: people will enable 10-bit even if it makes greyscale less accurate for 8-bit input, same as with dithering, because 10 is more than 8... DERP. The grey ramp test pattern helps a lot to verify things visually.

v2.7 made Novideo_sRGB more accurate than it ever was.

ledoge commented 2 years ago

Interesting, I thought it should be 1024/256, zero included.

Yeah, it's kinda counterintuitive. Try thinking about it like this: the 8-bit value 0 has to be mapped to the 10-bit value 0 (the minimum value), and the 8-bit value 255 has to be mapped to the 10-bit value 1023 (the maximum value). The only (linear) operation that satisfies these properties is multiplication by 1023/255.
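You can check the scaling numerically (plain Python):

```python
# 8-bit -> 10-bit scaling uses 1023/255 (~4.0118), so most 8-bit codes land
# between two 10-bit codes; 8-bit -> 16-bit uses 65535/255 = 257 exactly.
for v8 in (0, 1, 128, 255):
    print(v8, v8 * 1023 / 255, v8 * 257)
# 0 and 255 map to 0.0 and 1023.0 as required, but e.g. 1 -> 4.0118
```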

Maybe the NVIDIA driver is doing some kind of conversion from 8-bit input to 10-bit with dithering when the display is set to 10 bit?

I'm not sure, but it would probably make more sense than truncating to the nearest 10 bit value.

In any case, I believe 8-bit should be the default and 10-bit processing can be a switchable option if needed. Warning though: people will enable 10-bit even if it makes greyscale less accurate for 8-bit input, same as with dithering, because 10 is more than 8... DERP.

I definitely agree. It's certainly a niche use case, but I'll try to think of a good label so that I can add it as an option without everyone enabling it when they shouldn't.

ledoge commented 2 years ago

Just pushed a release with that option added – I think the label is ambiguous and "scary" enough that people won't just enable it randomly.

James-F2 commented 2 years ago

Testing it now.

I'm not sure if the 10-bit switch is working; I don't see a change from 8-bit.

ledoge commented 2 years ago

Weird, I double checked using the debugger and the switch is definitely working as intended.

James-F2 commented 2 years ago

Could be, I'm just not seeing the same kink as in 2.6 or the test build.

Even with dithering disabled, I see no change.

James-F2 commented 2 years ago

Wait... I'm sorry, I do see a change, but it is very tiny: an extremely small grey hue shift in the kinks when dithering is disabled, barely visible unless zoomed in 2000% on the kink.

ledoge commented 2 years ago

I have absolutely no idea why there'd be a kink with v2.6 but not with v2.8 and the optimization disabled, as the math should be 100% the same... Good to know that it's at least doing something though.

James-F2 commented 2 years ago

Yeah, the math is exactly the same, except now it samples a ready variable instead of an equation. Maybe the compiler prefers filling arrays from ready variables instead of calculating them on the spot... timing?... just guessing.

Anyway, I'd like to think that the "new" 10-bit mode in 2.8 is more accurate, because I do not think displaying 10-bit on an 8-bit output should have errors as clearly visible as the ones I saw in 2.6. I think it is a very welcome, if accidental, fix.

aufkrawall commented 2 years ago

Can confirm that the new 8-bit optimization helps. It's very noticeable with a black-to-orange 8-bit gradient when dithering is disabled. And even with temporal dithering to 8 bit, there is still a bit of an improvement in the darker shades. Nice! :)

ledoge commented 2 years ago

@James-F2 I compared v2.6 vs v2.8 with the optimization disabled by using a separate program to dump the LUT contents, and they're 100% identical. Are you sure you were using the same settings and the same profile when comparing them?

James-F2 commented 2 years ago

Not in 2.6, but the Test Build you posted in this thread. This post: https://github.com/ledoge/novideo_srgb/issues/18#issuecomment-1065651500

With this Test Build I see a bright kink; is it not identical to 2.6?

EDIT: v2.6 looks fine, but it has the small 10-bit-on-8-bit error you fixed in v2.7.

EDIT2: Sorry for the confusion, everything is fine; I thought the Test Build and v2.6 were identical in greyscale processing.

aufkrawall commented 2 years ago

@ledoge I suppose this is a driver bug, but did you notice that an active clamp makes the gamma of the hardware mouse cursor too bright? This is hard to notice with the default Windows cursor theme, but when e.g. games use a custom hardware cursor, it is more distinct.

ledoge commented 2 years ago

Interesting, haven't noticed that myself, but I did notice what I believe to be a similar issue with HDR. If you're in SDR mode and have an HDR application (e.g. the DisplayHDR Test app) fullscreened, gamma is way too dark.

Is your cursor entirely blown out or just a bit too bright? It would be interesting to test whether this also happens when applying a novideo_srgb calibration using something like the sRGB profile, as that may put the re-gamma LUT in a similar state while the color space conversion pipeline is not being used.

aufkrawall commented 2 years ago

It's quite substantially brighter, yes. It also happens with the sRGB gamma curve. It seems to happen after dragging a window a bit (which temporarily activates the software cursor until the window is dropped), but moving the cursor over resize borders can reset it back to normal (probably due to a temporary theme change).

James-F2 commented 2 years ago

I currently favor Relative Gamma 2.2 with 0% Black Output Offset; this is the closest an LCD can get to pure 2.2 gamma without crushing black shades. I think it looks better than the sRGB curve for everything, including movies, games, and the internet.

Same settings as above but with Gamma 2.4 for movies, which I prefer over BT.1886, which elevates black shades like sRGB does. In madVR you can increase gamma to 2.4 with the same results if you start with Relative 2.2 0% in novideo_srgb.
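For reference, this is roughly how BT.1886 lifts the near-black region compared to a pure 2.4 power curve; a minimal Python sketch of the ITU-R BT.1886 EOTF with made-up white/black luminance levels:

```python
def bt1886(v, lw=100.0, lb=0.1, g=2.4):
    # BT.1886 EOTF: L = a * max(v + b, 0)^g, with a and b derived from the
    # display's white (lw) and black (lb) luminance in cd/m^2
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

print(bt1886(0.1))         # ~1.07 cd/m^2 at 10% input
print(100.0 * 0.1 ** 2.4)  # ~0.40 cd/m^2 for a pure 2.4 power curve
```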

As for dithering, I'm using SpatialDynamic2x2; after careful comparison of all the options at 6-bit, I found this gives the smoothest results with the least amount of noise and no visible dithering pattern.

A tip regarding calibration: plug in the measuring device's USB and let it warm up on the table for 30 minutes; only put it on the screen just before calibrating, and take it off the screen after calibrating. In other words, don't let it heat up on the screen or receive constant light from it. I find this results in a better calibration and more accurate gamma tracking.

EDIT: Another thing I do in the NVIDIA Control Panel: under the "Adjust desktop color settings" tab I enable "Override to reference mode", which disables any color space processing, just in case.