Yes, I did not initially implement this in vo=gpu as I was focused on getting the general PQ stuff done (also, with the test content I was utilizing during development, I did not see major issues on my specific screen from this information missing).
So yes, vo=gpu with the d3d11 back-end will only set the swap chain to PQ+BT.2020 mode in 10 bit; it will not pass through the HDR10 metadata.
As I have essentially rewritten nev's changes to libplacebo's d3d11 module regarding swap chain color space etc., I have a pretty good grasp of how the API works. However, I think only the max brightness bit is available through the mpv interfaces right now at the location where the metadata is currently set (at swap chain creation). With that merge request's changes, vo=gpu-next with --target-colorspace-hint=yes should set both the swap chain color space and the HDR10 metadata if the decoded frame has that information.
Thanks. A quick search reveals that the relevant API is IDXGISwapChain4::SetHDRMetaData. To be clear, are you saying mpv currently doesn't call this API but will in the future?
I eagerly await these updates!
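For reference, here is a minimal sketch of what a call to that API looks like (an illustration of the DXGI interface only, not mpv's actual code; the 4000-nit values are chosen to match the test clip discussed below, and the unit conventions follow the DXGI documentation and Microsoft's HDR samples):

```cpp
// Illustrative sketch, not mpv code: pushing HDR10 metadata to a D3D11/DXGI
// swap chain via IDXGISwapChain4::SetHDRMetaData, describing a hypothetical
// 4000-nit BT.2020 master.
#include <dxgi1_5.h>

HRESULT SendHdr10Metadata(IDXGISwapChain4 *swapChain) {
    DXGI_HDR_METADATA_HDR10 md = {};
    // Chromaticities are normalized to 50000 (units of 0.00002).
    md.RedPrimary[0]   = static_cast<UINT16>(0.708  * 50000); // BT.2020 red
    md.RedPrimary[1]   = static_cast<UINT16>(0.292  * 50000);
    md.GreenPrimary[0] = static_cast<UINT16>(0.170  * 50000); // BT.2020 green
    md.GreenPrimary[1] = static_cast<UINT16>(0.797  * 50000);
    md.BluePrimary[0]  = static_cast<UINT16>(0.131  * 50000); // BT.2020 blue
    md.BluePrimary[1]  = static_cast<UINT16>(0.046  * 50000);
    md.WhitePoint[0]   = static_cast<UINT16>(0.3127 * 50000); // D65
    md.WhitePoint[1]   = static_cast<UINT16>(0.3290 * 50000);
    // Mastering luminance is in 0.0001-nit units, as in Microsoft's HDR samples.
    md.MaxMasteringLuminance = 4000 * 10000; // MDL: 4000 nits
    md.MinMasteringLuminance = 50;           // 0.005 nit
    md.MaxContentLightLevel      = 4000;     // MaxCLL, whole nits
    md.MaxFrameAverageLightLevel = 1000;     // MaxFALL, whole nits
    return swapChain->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_HDR10,
                                     sizeof(md), &md);
}
```

Whether the OS and driver actually forward these values to the display is a separate question, as the rest of this thread shows.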
Out of curiosity, what should I see in the test pattern (White_900-4000nits-MaxCLL-4000-) on a 1000-nit TV, with HDR on in Win11? For all players (mpv, MPC-BE, JRMC29) including the TV's media/Plex app, I see the white boxes merging at around 1500 nits. Yet via the Win11 Movies & TV app I see merging at around 3600 nits?
Out of curiosity, what should I see in the test pattern (White_900-4000nits-MaxCLL-4000-) on a 1000-nit TV, with HDR on in Win11?
If the full chain is working properly, i.e. assuming that the metadata is preserved throughout AND your display's HDR tone mapping is any good, you should in theory see the full range or most of it. For example, on my setup (Windows 11 21H2 22000.613 HDR ON, 3080 Ti driver 512.15, LG G1 firmware 03.20.16), I can see the difference up until 3500 nits or so using VLC and madVR, though they need to be full screen and I need to wait for about 10 seconds or so as the TV "fades" into the new metadata - it doesn't switch instantly. If I force the TV to assume 1000 nits using a hidden LG menu, the whole ramp disappears, which is exactly what one would expect and confirms that the TV is tone mapping at 4000 nits peak otherwise.
I just tested using the Windows 11 "Films & TV" app and the result seems incorrect - it's blending in at about 2500 nits or so.
Unfortunately I don't know of an easier way to diagnose HDR metadata issues - LG TVs will not tell you which peak luminance they're currently receiving over the HDMI link (not even in hidden service menus), and Windows/nvidia will not tell you what metadata they're sending, either. So basically one has no choice but to do this blind and guess at what's happening based on what test signals look like. AFAIK the only way to truly know what's actually happening over the HDMI connection is to use an analyser like Dr HDMI, but that's a lot of money just to confirm a single number :/
Ah OK, I see. Yet VLC looks similar to mpv/JRMC for me; it merges at around 1500 nits.
If I set target-peak = 1000 in mpv I see the full range (no merging until the last box), so I assume this is more "correct"? Since even my TV's native apps merge at 1500 nits, does my TV have bad tone mapping?
I assume for HDR10 content that's not a big issue, since most have a static MaxCLL = 1000, yet Dolby Vision files usually target 4000 nits from what I have seen?
So should I set target-peak manually and let mpv tone map, or does this also have some drawbacks?
PS: I also noticed that all Win11 players will only display the black-level-v1 test up until 80, while the TV's native apps display until 64. Meaning I can only see the last 4 boxes (66-76) via TV apps. Any tip to fix this for Win11?
Yet VLC looks similar to mpv/JRMC for me; it merges at around 1500 nits.
Then your setup is not tone mapping properly. Either because the component responsible for tone mapping (i.e. your display, if you're not tone mapping in the player) is bad, or because it's not getting correct metadata for some reason. Or there is something else going on which is unknowable without direct access to your particular setup. Unfortunately this can be pretty tricky to troubleshoot given how opaque the various components tend to be in terms of diagnostics. Sadly one often finds themselves fumbling in the dark when it comes to this stuff.
If I set target-peak = 1000 in mpv I see the full range (no merging until the last box), so I assume this is more "correct"? Since even my TV's native apps merge at 1500 nits, does my TV have bad tone mapping?
I'm honestly not very familiar with mpv (I only started using it recently), so I'm not 100% sure what the --target-peak=1000 option would do. Looking at the docs, it seems like that would affect mpv's own tone mapping? But then that means you are tone mapping in mpv, not in your display? Does passing --target-peak automatically imply enabling tone mapping in mpv? Or did you enable mpv tone mapping manually as well? (Maybe someone more familiar with these options could clarify.)
To clarify: the context around my original bug report is a setup in which I want the display (i.e. my LG TV) to handle the entire tone mapping process - I don't want mpv to do any tone mapping and I want both pixel values and metadata to be passed through to the HDMI port unchanged.
I assume for HDR10 content that's not a big issue, since most have a static MaxCLL = 1000, yet Dolby Vision files usually target 4000 nits from what I have seen?
According to this (admittedly old) post, 4000-nit HDR10 movies are actually quite common.
So should I set target-peak manually and let mpv tone map, or does this also have some drawbacks?
If your display's tone mapping is bad/broken, then it might make sense to tone map in mpv instead. But the problem is, if you do that, then you have to make sure that your display's tone mapping is completely turned off (i.e. hard clipping) - you don't want tone mapping to be done twice; the results will be incorrect/bad. The way to do that depends on your particular setup and display. Honestly I wouldn't attempt something like this without access to a color meter to double-check the resulting response curve, especially since this might not just affect luminance, but color gamut as well.
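To see why double tone mapping is harmful, here is a toy sketch (my own construction, not mpv's or any display's actual algorithm; the hypothetical roll_off curve just stands in for a generic highlight roll-off such as BT.2390's EETF):

```cpp
// Toy illustration of double tone mapping: applying a highlight roll-off
// twice darkens midtones and highlights more than intended, whereas a
// display that hard clips leaves the player's single pass intact.
#include <algorithm>
#include <cstdio>

// Hard clip: pass values through unchanged, clamping at the display peak.
double hard_clip(double nits, double peak) { return std::min(nits, peak); }

// Generic roll-off: linear below the knee, smooth compression above it.
double roll_off(double nits, double peak) {
    const double knee = 0.5 * peak;
    if (nits <= knee) return nits;
    // Maps (knee, infinity) monotonically into (knee, peak).
    return peak - (peak - knee) * (peak - knee) / (nits - 2 * knee + peak);
}

int main() {
    const double peak = 800; // display peak, in nits
    const double inputs[] = {400, 800, 2000, 4000};
    for (double in : inputs) {
        double once  = hard_clip(roll_off(in, peak), peak); // player maps, display clips
        double twice = roll_off(roll_off(in, peak), peak);  // both tone map
        printf("%6.0f nits in -> once: %6.1f, twice: %6.1f\n", in, once, twice);
    }
}
```

For example, with these made-up curves an 800-nit pixel comes out at 600 nits after one pass but only about 533 after two, so the whole image ends up dimmer and flatter than either pass intended.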
Just noticed you use the 0.34.0 version, so what vo are you using? Metadata pass-through should only work via gpu-next and --target-colorspace-hint. By default I think mpv still uses vo=gpu on Windows?
At the moment we have two major options on Windows, vo=gpu and vo=gpu-next; see here: https://github.com/mpv-player/mpv/wiki/GPU-Next-vs-GPU
I use the daily builds from here with gpu-next, the main reason being that gpu-next can "correctly" display Dolby Vision-only files, so no wrong colors.
On the other hand, gpu clearly handles peaks/highlights differently/better than gpu-next, which is visible in the HDR10-LG-Cymatic-Jazz and HDR10-Sony-Bravia-OLED test files.
I used 0.34.0 with all defaults, so I guess I'm using gpu (and the log I attached seems to confirm that). I could try with gpu-next, but according to what @jeeb said above, proper HDR peak luminance metadata transmission is not implemented anywhere, even in gpu-next, so I didn't bother.
need to wait for about 10 seconds or so as the TV "fades" into the new metadata - it doesn't switch instantly
That is a bug then. Every field in the AVI InfoFrame must be changed from the previous signal's values to the next one's. gpu-next works correctly in Windows 11's HDR-on mode.
I am pretty sure it's the TV that does this, presumably to avoid sudden/jarring luminance changes. The reason why I think it's the TV is because the same "progressive fade" effect occurs if I override maxCLL/mastering peak in the TV hidden menu.
Windows/nvidia will not tell you what metadata it's sending, either.
There is a small patch that allows it on windows side.
Oh? Color me interested! Where can I find this patch?
The patch is here: #9421 (comment)
Ah, okay, but that's an mpv patch. I misunderstood; I thought you had a way of making the GPU driver (e.g. nvidia) indicate precisely which maxCLL it's currently sending down the HDMI wire at the hardware level. Knowing what maxCLL is at the output of mpv is nice I guess, but there's so much that can go wrong at the OS/driver level…
had a way of making the GPU driver (e.g. nvidia) indicate precisely which maxCLL it's currently sending
That is using the API that is documented to do precisely that.
No it is not. The patch you have linked to is just verbose-logging what is being passed into the "please set this hint into the swap chain" API of libplacebo as part of the FFmpeg AVFrame (as haasn didn't feel like integrating the full CLL or mastering screen metadata into the mp_image structure itself).
There is no such API to my knowledge that returns exactly what the screen is currently configured to, unless it is vendor specific. And even in that case I do not know of such API.
Please stop, you are not being useful.
With a quick search there is an nvidia-specific API which might return the current configuration directly from the driver as the following struct: https://docs.nvidia.com/gameworks/content/gameworkslibrary/coresdk/nvapi/struct__NV__HDR__CAPABILITIES__V2.html (some pages mention NvAPI_Disp_GetHdrCapabilities being the function).
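For what it's worth, a rough sketch of what querying that entry point might look like (untested, assuming only the publicly documented NV_HDR_CAPABILITIES struct and NvAPI_Disp_GetHdrCapabilities function; field names are taken from the linked documentation):

```cpp
// Untested sketch based on public NVAPI docs: ask the NVIDIA driver for a
// display's HDR capability report. As noted in the reply below, this likely
// reflects what the display advertises (EDID), not the metadata on the wire.
#include <nvapi.h>
#include <cstdio>

void PrintHdrCaps(NvU32 displayId) {
    // NvAPI_Initialize() must have been called once beforehand.
    NV_HDR_CAPABILITIES caps = {};
    caps.version = NV_HDR_CAPABILITIES_VER;
    if (NvAPI_Disp_GetHdrCapabilities(displayId, &caps) == NVAPI_OK) {
        printf("ST.2084 (PQ) EOTF supported: %u\n",
               (unsigned)caps.isST2084EotfSupported);
        printf("Desired content max luminance: %u nits\n",
               (unsigned)caps.display_data.desired_content_max_luminance);
    }
}
```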
With a quick search there is an nvidia-specific API which might return the current configuration directly from the driver as the following struct: https://docs.nvidia.com/gameworks/content/gameworkslibrary/coresdk/nvapi/struct__NV__HDR__CAPABILITIES__V2.html (some pages mention NvAPI_Disp_GetHdrCapabilities being the function).
I suspect this will only return what the display is capable of (i.e. from its EDID), not the details of the signal it's currently being sent.
Right, quite possible. I was just doing a quick search. In that sense it would be similar to the Windows standard API that returns the EDID information, and thus nvapi shenanigans wouldn't be required.
I did think of plugging that information into mpv's tone mapping output target brightness, but at least without the peak brightness metadata it just caused the image to lose brightness :) (although if I had set mpv's tone mapping mode to clip it might possibly have fared better).
Yeah another problem is that some displays don't populate this information in their EDID. For example my LG G1 doesn't have max luminance information in its EDID (which is a bit surprising since it's well-known to be about 750-800 nits - must have been an oversight from LG).
@dechamps Quick update after fiddling with my setup last night. I reset the TV and started with a fresh calibration, and now I can see all boxes (merging only in the last two) on the 4000-nit sample.
My HDR config with the latest daily build https://github.com/mpv-player/mpv/commit/bb5b4b1ba61b67da40c85c34376aced9383fc366:
vo = gpu-next
hwdec = auto
target-colorspace-hint = yes
gamut-mapping-mode = clip
blend-subtitles = no
gpu-api = d3d11
gpu-context = d3d11
d3d11-output-format = rgb10_a2
fbo-format = rgba16f
d3d11-output-csp = pq
target-trc = pq
target-prim = bt.2020
hdr-compute-peak = no
PS: I still see the bug where, without explicitly setting target-trc = pq for gpu-next, the colors are wrong, while in contrast I can omit this setting for vo=gpu?
This is just ridiculous. VK_EXT_hdr_metadata is mandated to behave correctly (same for its D3D12 and D3D11 counterparts), WTF. It is used in games too, after all.
I would not trust HDR metadata APIs to behave correctly without double-checking on the final setup. OSes have bugs. GPU drivers have bugs. Incorrect HDR metadata transmission can easily go unnoticed because its effect won't be obvious unless you know where to look.
Heck, last time I checked, on Windows 10 HDR metadata APIs didn't work at all, regardless of player, despite Windows 10 having an HDR output mode. I had to upgrade to Windows 11 for it to work.
There is also the more general problem that OSes can't blindly pass these calls through in general, because multiple applications could be running at the same time with different requirements (e.g. standard SDR apps vs a HDR video player) and the OS has to do its best to keep everyone happy. For example I noticed that with VLC and madVR, in general the metadata is only transmitted if the player is full screen. Which can lead to other potential problems e.g. the OS/driver not treating some app as full screen even though it is, and dropping its HDR metadata.
@Andy2244 Using your exact config, verbatim, with the latest shinchiro build mpv-x86_64-20220424-git-9d133eb, I am still not getting proper gradation on the Mehanik test sample White_900-4000nits-MaxCLL-4000-MDL-4000.mp4. Here's the log.
@dechamps Yeah, I also noticed that VLC has more pronounced boxes for the colored test sections. The only way I could improve this test was by also adding/changing these settings:
hdr-compute-peak = yes
target-peak = 1300
tone-mapping = bt.2390
tone-mapping-param = 2.0
So this does a pre-tonemapping pass to the target nits while trying to preserve the dynamic range, which is far from your untouched pass-through goal.
@dechamps FYI my MR into libplacebo got merged, so git master mpv with libplacebo version >= 4.204.0, when used with --vo=gpu-next and when enabling color space hints with --target-colorspace-hint=yes, should now configure the values based on the input.
edit: The usual notes from https://github.com/mpv-player/mpv/issues/10158#issuecomment-1116134571 still apply of course, due to how MS handles things with regards to d3d11.
@jeeb Sorry it took me a while to get around to testing your solution. I can confirm that, with the shinchiro 20220605 build, maxCLL metadata appears to be correctly transmitted using --vo=gpu-next --target-colorspace-hint=yes, as long as the player is fullscreen. Thanks!
(I'll just mention in passing that it would be great if these "make it work the way it's supposed to" options could be enabled by default, instead of mpv users getting suboptimal HDR rendering out-of-the-box. Presumably it's only a matter of time though.)
Reproduction steps
Play a video suitable for testing HDR metadata/tone mapping.
I use the Mehanik HDR10 test patterns, specifically 02. White_Color clipping\04. White_900-4000nits-MaxCLL-4000-MDL-4000.mp4, which usually makes it obvious if the HDR metadata is being sent correctly or not.
Expected behavior
The HDR metadata is sent correctly. In the case of the test video mentioned above, a peak luminance of 4000 nits should be sent to the display; as a result the display should apply the corresponding tone mapping curve resulting in proper gradation across the band.
Actual behavior
The metadata appears to be sent incorrectly (i.e. the gradations disappear), even in full screen mode.
Additional information
Note that, strangely, the issue is not always perfectly reproducible. Most of the time mpv won't be able to get the metadata across, but once in a blue moon it actually works. It's not clear to me what triggers it.
At first I thought it could be triggered by using mpv after using another HDR-capable player first (e.g. VLC or madVR), but that doesn't seem to always be the case.
This smells like an OS or GPU driver bug, but I will point out that neither VLC nor madVR ever seem to get this wrong (as long as they are in full screen mode) - as far as I can tell, only mpv is affected.
Log file
log.txt