Closed clsid2 closed 7 months ago
I have a question. Has anyone tried changing DISPLAY_LUMINANCE_PEAK on different videos and displays? What was the result? Within what limits does it make sense to change this parameter?
Yes, I have tested different values for DISPLAY_LUMINANCE_PEAK. A lower value will give a brighter result.
What value looks best depends on the monitor and its configuration. I, for example, have set my monitor to a very low brightness because I prefer that for normal work. But with such a setting, the HDR tonemapping is too dark with current MPCVR. Projectors also often have low peak luminance (<=80 nits). But there are also people with modern monitors that can do 200 nits if brightness is set very high.
50-200 seems like a good initial range to me. But the range can always be increased if users need it. I think many users will be happy if this option is added. Then they can adjust it according to their monitor brightness and preference.
Using MaxCLL would need some testing first. If a file uses a lower value for that (like 4000 instead of 10000), then the result would get darker. So in theory such files would be too bright right now (on an actual 125-nit display).
There is no need to leave this issue open. If you have the desire/opportunity, make either a patch or a pull request.
@clsid2 Do you intend to make this patch? It seems like it could be really beneficial.
Also @adipose, I see you have made patches in the last few days. Is this something you think you could help with?
50-200 seems like a good initial range to me. But the range can always be increased if users need it.
You can take screenshots for the values 50 and 200 using samples from here: https://kodi.wiki/view/Samples
I suggest downloading one of these files:
- HDR 10-bit HEVC 59.94fps (in MP4, Camp by Sony)
- VP9 Profile 2 HDR 24fps The Redwoods (in MKV) (thanks @wesk05)
- VP9 Profile 2 HDR 59.94fps The World in HDR (in MKV) (thanks @wesk05)
And a second question: why don't you use the option to adjust brightness and contrast?
Adjusting brightness and contrast is just the wrong solution.
50
125
200
125 looks best on my display.
The whole point is that it depends on the display what is best.
I will make a patch when I have time.
I don't know how the tone mapping algorithm works in detail in MPC-VR, but for accurate tone mapping you need to know the target gamma and brightness.
In tone mapping there are two important parts. The "linear part" runs from 0 brightness up to a knee point that depends on the peak brightness; this part is not compressed at all. That's why you cannot just adjust the brightness: that would compress everything, making the results wrong. The same issue applies to just changing gamma.
After the knee point, the signal is actually tone mapped: compressed, rolled off and so on. Only that part should change when the peak brightness is changed.
Exposing peak brightness, and if possible gamma, would help quite a bit. https://github.com/Aleksoid1978/VideoRenderer/blob/master/Shaders/d3d11/ps_convert_pq_to_sdr.hlsl
color = pow(color, 1.0 / 2.2);
Exposing that 2.2 should do the trick. BTW, that's not sRGB.
What gamma value should be used if it were configurable?
Whatever you have calibrated your device to; that's why it should be configurable, there is no one true number. If not defined, madVR guesses 2.2, and so does mpv. So 2.2 is what other renderers use, which is a "good" start.
The spec today is BT.1886, but that's quite out of scope, because then you have to define the brightness and the black point too. So I suggest ignoring BT.1886 for now; it's not like I've ever heard of someone calibrating to it and sticking to it on an LCD. BT.1886 is 2.4 on an OLED, for example, and before BT.1886 the "specification" was 2.4. So many will change this to 2.4, or to 2.22 as an sRGB approximation. Some projector users like to use 2.35.
Gamma is the wild west: everything between 1.8 and 2.6 can be used.
BT.1886 has an exponent of 2.4. But then a decoding gamma of 2.2 should be used: https://www.image-engineering.de/library/technotes/714-color-spaces-rec-709-vs-srgb
Not this again :-)
BT.709 doesn't have a display transfer function; it only has an OETF, which we should ignore. The spec itself tells us not to use the OETF as an EOTF. Don't believe me, believe the specification: https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.709-6-201506-I!!PDF-E.pdf
"In typical production practice the encoding function of image sources is adjusted so that the final picture has the desired look, as viewed on a reference monitor having the reference decoding function of Recommendation ITU-R BT.1886, in the reference viewing environment defined in Recommendation ITU-R BT.2035."
We are supposed to use the production transfer function, and that was anything but the OETF.
The "gammas" in this link are not used: https://www.image-engineering.de/library/technotes/714-color-spaces-rec-709-vs-srgb. Neither sRGB nor the BT.709 "gamma" is a video gamma. An EOTF gamma was not defined in BT.709; the first definition is BT.1886: https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.1886-0-201103-I!!PDF-E.pdf. And it is not only that the exponent is 2.4: if black is 0 in the formula, it results in a true gamma of 2.4. The exponent alone does not matter, because sRGB also uses an exponent of 2.4 but its effective gamma is ~2.22 (the poor man's sRGB), and the BT.709 "gamma" uses an exponent of 2.2 but the resulting gamma is ~1.961. sRGB is used, just not for video. Microsoft thinks everything is sRGB: for SDR -> HDR, Microsoft will use sRGB for SDR, resulting in a wrong transfer function for video and a "correct" one for applications.
And the most important part here is that we don't care about any of these; we care about the real gamma of a display. And those are typically BT.1886, gamma 2.2-2.4, or sRGB. Even if I wanted to use the BT.709 "gamma", I would have a hard time doing so, because even the very powerful DisplayCAL does not support it out of the box. And why would it?
Want another proof from me of why you should never use the 1.961 BT.709 gamma? Take mpv with gpu-next and make a screenshot (press s or S). It will write 1.961 into the metadata, and Microsoft believes that in the Photos app and tries to "fix" it, resulting in quite the mess.
Sorry, I was not the person who forgot to define an EOTF when BT.709 was created. I'm skipping a lot of stuff here, like lighting conditions in a room: under studio lighting 2.2 was used, while cinema presentation used 2.4 in a dark room to look the same, and you can end up with 1.8 in a daylight-flooded room. Another fun fact: sRGB specifies a brightness of 80 nits and ambient light with a white point of D50 while the display white point is D65, and the original sRGB curve is truncated, resulting in a non-continuous transfer function and banding. Just madness.
I am trying to pass the values as shader constants, but am struggling to get it to work. Edit: got it working for DX9.
In CDX9VideoProcessor::ConvertColorPass:

float fConstDataHDR[][4] = {
    {10000.0f / 125.0f, 0.0f, 0.0f, 0.0f}
};
hr = m_pD3DDevEx->SetPixelShaderConstantF(0, (float*)&m_PSConvColorData.Constants, sizeof(m_PSConvColorData.Constants) / sizeof(float[4]));
hr = m_pD3DDevEx->SetPixelShaderConstantF(4, (float*)&m_DoviReshapePolyCurves, sizeof(m_DoviReshapePolyCurves) / sizeof(float[4]));
hr = m_pD3DDevEx->SetPixelShaderConstantF(50, (float*)fConstDataHDR, sizeof(fConstDataHDR) / sizeof(float[4]));

(Note: I think it should be sizeof(float[4]), since the documentation says the last parameter is the number of registers, not the number of floats.)

The same addition goes in CDX9VideoProcessor::Process, before the call to m_pD3DDevEx->SetPixelShader(m_pPSCorrection).

And then in the shader use:

float hdrfactor : register(c50);
Patch that adds "TargetNits" option for DX9: https://github.com/clsid2/MPCVideoRenderer/commit/06f229c8ec331295f959c967c24f8ed2dbd26724
What a half-patch - no DX11, no UI.
Does it work only for VP, or for shaders too? The UI must apply changes on the fly.
Sounds good enough to me, so users can test whether the algorithm can handle all of this properly in the first place.
It is a work in progress. For DX9 it works for both the VP and shader cases. It works well for customizing tone mapping brightness.
I have not yet touched the DX11 code. I am not very familiar with it yet, so any help is appreciated.
I am not adding a GUI option yet, because you probably want to decide yourself how and where to place it in the options dialog.
The current patch maintains existing behavior by default (125 nits) and allows the user to manually tweak the value. So it is already usable if you want.
DX11 patch: https://github.com/clsid2/MPCVideoRenderer/commit/3c065ed75eb71728a5302e3aa50fb1feddbfe4e9
It is not working yet. Need help fixing it.
Debug :)
Did you read my comment about the last parameter of SetPixelShaderConstantF? You are currently using the wrong size value.
You are very helpful...
My quick option if using a D3D11 video processor. LuminanceScale_1.diff.txt
Thanks.
Do I understand the code correctly that DX11 does not do any HDR tonemapping if the VP is disabled/unavailable?
No - if there is no VP, it must use shaders.
I know it then uses shaders for color correction and resizing. But it does not seem to use the tone mapping shader?
When shaders are used, they are created from a string and compiled, see https://github.com/Aleksoid1978/VideoRenderer/blob/master/Source/Shaders.cpp#L555
Thanks. Got it working with shader processing as well. https://github.com/clsid2/MPCVideoRenderer/commits/mods/
targetnits_option_patch.zip (I can merge them into one and make a PR if needed.)
Only thing missing is GUI option. Can you guys do that?
The UI is not so easy; it must change on the fly :)
I think the config changes already work, similar to other options. I added code there.
Edit: I have added GUI elements. Not functional yet. Will do that later.
GUI option is now functional.
There is some problem with saving the value while the filter is active, during playback.
Hmm, the control sometimes resets to 100 on opening. Fixed that now.
Good !
Yes. Any comments on variable names? Then I can make those changes as well.
I can for example change iTargetNits to iDisplayNits. And hdrfactor in the DX9 shader code to LuminanceScale, as in DX11.
Or whatever names you prefer.
1 - Maybe "SDR display nits" is better? So the variable "iTargetNits" -> "iSDRDisplayNits".
2 - Make it available or unavailable depending on the "Convert to SDR" option.
About "hdrfactor -> LuminanceScale" - OK :)
When you are done - can you merge all the code into one patch and upload it?
@clsid2 Do you plan to also expose the gamma values as discussed above, or do you find it not worth the time investment?
And also, do you plan to set SRC_LUMINANCE_PEAK to the relevant value, if available? Because the current patch still hard-codes the value, and it might be beneficial to get it straight from the HDR metadata, when available.
No plans for that.
Too bad no one is calibrating to gamma 2.2 these days.
For using MaxCLL I need a few good sample files. But I got no reaction from anyone when asked on Doom9, so I guess nobody cares, and so I won't spend my time on that now. I don't have a collection of HDR movies.
@mightyhuhn do you have/can gather some samples of HDR content for this?
Some HDR samples are here: https://kodi.wiki/view/Samples
You guys can test it yourselves. I am not going to hunt through those files looking for a proper sample with a good MaxCLL value.
For files with a low MaxCLL, does increasing Display Nits (to 400) make the tonemapped result look better? If not, using MaxCLL is pointless. Some people on Doom9 were expecting the exact opposite of what would happen in reality with the current algorithm.
(Increasing display nits has a similar effect as lowering SRC_LUMINANCE_PEAK, since the shader uses the factor SRC_LUMINANCE_PEAK/DisplayNits.)
Edit: tested with a 1000-nit file and it looks like total crap, so I won't be implementing this.
Too bad no one is calibrating to gamma 2.2 these days.
So what values are being used?
Also, I think that in general exposing the gamma values could be beneficial. However, I'm not a coder, and I have no idea how difficult or time-consuming it is to create such a patch, or whether the benefits are noticeable enough.
@clsid2 @adipose do you think exposing gamma values is even worth the trouble?
For using MaxCLL I need a few good sample files. But I got no reaction from anyone when asked on Doom9, so I guess nobody cares, and so I won't spend my time on that now. I don't have a collection of HDR movies.
@mightyhuhn do you have/can gather some samples of HDR content for this?
My HDR knowledge is very limited (technical only). I just use the jav test sample he created for testing. The Doom9 guys are not really active anymore; they are on AVS, and the number there is also shrinking heavily. Not sure if they can be moved to look at static tone mapping.
Too bad no one is calibrating to gamma 2.2 these days.
So what values are being used?
Also, I think that in general exposing the gamma values could be beneficial. However, I'm not a coder, and I have no idea how difficult or time-consuming it is to create such a patch, or whether the benefits are noticeable enough.
@clsid2 @adipose do you think exposing gamma values is even worth the trouble?
BT.1886 is the first actual spec. 2.4 has been used for 20 years now. If you haven't calibrated your device, just ignore this.
Try madVR; the idea is the same as its calibration tab. The gamma setting under "this display is already calibrated" affects HDR tone mapping and does nothing to SDR.
After the nits option is released, I will most likely just test it myself and compile a couple of versions myself.
If you want to test the nits patch, the latest K-Lite update released today already includes it.
If anyone is interested in HDR to SDR conversion that takes HDR metadata into account, check Kodi's HABLE method. If it works out, you can try porting their work into this project.
A common problem with current tonemapping is that the result is too dark on some monitors.
A solution would be to make DISPLAY_LUMINANCE_PEAK an input parameter in convert_pq_to_sdr.hlsl. You can use SetPixelShaderConstantF to pass such a parameter value to the shader. The display luminance can then be an option in the GUI. I would suggest naming it "target nits" like in madVR, with a value range between 50 and 200.
Maybe also make SRC_LUMINANCE_PEAK an input parameter. Default to 10000 like it is now. But perhaps use MaxCLL from HDR metadata instead when known?
Please don't close. I can do it in the future if needed. But you can do it faster, as it shouldn't be difficult.