afontenot closed this issue 3 years ago.
@haasn I hope you don't mind me tagging you on this; you seem to be the expert on mpv's handling of color spaces. Could you explain what mpv is doing with the curves, and say whether it's possible to get it to adapt the video to my profiled display?
I did a little more looking into this. Here's how I know for sure there's a problem and how anyone can replicate it for themselves.
I use mpv on a wide gamut screen with a gamma pretty close to 2.2, not adjustable. mpv is set to use an ICC profile I created for this screen. If you can replicate those conditions, you'll be able to see the following bug.
Open a file encoded in Rec. 709 with `mpv --no-config`. Save a screenshot. Open the screenshot in a color managed image viewer (I like eog). Now open the same file in mpv with your regular config, i.e. with `icc-profile=` set. Compare the images.
You'll notice that the images look exactly the same (or close enough). But they shouldn't! The screenshot is not tagged, because mpv sets `screenshot-tag-colorspace=no` by default. This means that the way mpv displays a Rec. 709 video (with color management) is the same way that an image viewer displays an sRGB image (with color management). But I know this is wrong, because a Rec. 709 video is supposed to be displayed with a Rec. 1886 EOTF.
Now, tag the screenshot with a Rec. 1886 profile (a profile that specifies 709's primaries and 1886's gamma curve). Open it up in the image viewer. I immediately see that the image has become much darker, because eog is (correctly) rendering an image with a designed EOTF of 2.4 for a screen with a ~2.2 EOTF. mpv does not do this unless I specify `--target-trc`, but that apparently doesn't work when using an ICC profile, and without the profile, the colors are ruined on my screen.
So what I'd like is the ability to have mpv's color management not only correctly render the colors for my screen's primaries, but also correctly render the intended display gamma of a file for my screen's actual gamma.
Here's a screenshot that shows the issue. (Due to color management, you shouldn't expect the colors to be 1:1 with your screen, but it does clearly show the difference.) The bottom right is with color management disabled, to show how `--target-trc` can approximate the correct EOTF for my screen, but the colors are now wrong (oversaturated).
You can probably achieve what you want to accomplish by overriding the video's tagged TRC using e.g. `--vf=format:gamma=gamma2.2`?
> You can probably achieve what you want to accomplish by overriding the video's tagged TRC using e.g. `--vf=format:gamma=gamma2.2`?
Hey @haasn, thanks for the reply, I hope you're doing well!
I think we may have miscommunicated on what exactly I was trying to do, but actually the results I get from `--vf=format:gamma=gamma2.4` (not `gamma2.2`) are almost perfect! That is, they match the output I see from a color managed image viewer using the tagged still from the video I mentioned in my previous comment. But there may be more complicated reasons for this than I originally believed.
I believe the author of the color profiles that I used is using a pure 2.4 power curve for their "Rec1886" profile. Is that a mistake, should someone let them know? If a pure power curve is wrong, it would explain why the tagged image is so much darker than mpv's default. Maybe someone knows of a better Rec1886 profile to use with PNG images?
In any case, I think I will stick with `gamma2.4` for watching movies. With the default, the blacks are cranked up way too high on my screen and it makes encoding artifacts like banding way more noticeable. Gamma 2.4 gives me results that match my TV very closely; it has pretty decent black levels. Is there a way to tinker with the black level adjustment mpv is using? It looks to be overcompensating.
From the point of view of this bug report: it looks like mpv is (probably) targeting my display's response curve (so the title is wrong), but it's doing so with a gamma curve that creates quite a bit of black compensation, which looks bad on my display. This may or may not be the result of applying Rec. 1886 accurately, rather than using a pure power curve.
> From the point of view of this bug report: it looks like mpv is (probably) targeting my display's response curve (so the title is wrong), but it's doing so with a gamma curve that creates quite a bit of black compensation, which looks bad on my display. This may or may not be the result of applying Rec. 1886 accurately, rather than using a pure power curve.
Yes, that is almost certainly what is going on. This is opening up a can of worms, unfortunately.
The BT.1886 equation takes the display black and white levels as input and only specifies a pure 2.4 curve when the black level is perfectly zero. Any increase in the black level causes it to quickly raise shadows. It does not tend to look very good on a typical 1000:1 computer LCD. At that contrast ratio it tends to look like the sRGB encoding curve that also raises shadows. Whether it looks good with a better contrast ratio is a matter of opinion. I think it still looks a little washed out even on a 4500:1 TV and prefer a 2.2 power curve for normal viewing.
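For anyone following along, the BT.1886 equation (Annex 1 of the Recommendation) is short enough to write out directly. A minimal Python sketch, with `lw`/`lb` standing for the display white and black levels in cd/m²:

```python
def bt1886_eotf(v, lw=100.0, lb=0.1):
    """BT.1886 Annex 1 EOTF: luminance for a signal value v in [0, 1].

    lw = display white level (cd/m^2), lb = display black level.
    With lb == 0 this collapses to a pure 2.4 power curve.
    """
    g = 1.0 / 2.4
    a = (lw ** g - lb ** g) ** 2.4      # "user gain"
    b = lb ** g / (lw ** g - lb ** g)   # black-level lift, grows with lb
    return a * max(v + b, 0.0) ** 2.4

# A non-zero black level lifts the shadows relative to pure gamma 2.4:
# bt1886_eotf(0.1, 100, 0.1) is noticeably brighter than 100 * 0.1 ** 2.4.
```

Note how any increase in `lb` raises `b`, which is added to the signal *before* the power function; that is where the quickly lifted shadows come from.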
If your display profile reflects the black level (the tone response curve does not go all the way to zero) then mpv will estimate the contrast ratio and adjust the BT.1886 equation accordingly. mpv's LCMS2-based transform appears to correctly translate from this assumed encoding to your display's measured curve and gamut.
This will sound a little strange if you're used to ICC color management because you're changing the source's profile to match your output device's characteristics. The correct BT.1886 profile to use for PNG images would be one with the properly calculated curve for your display's contrast ratio. This makes some sense if you consider that most pro video is not color managed in the ICC sense and is based on what looked good on the reference monitor. (kind of like how you can simulate another monitor by applying its color profile to an image and using absolute colorimetric intent, except we're assuming the reference display was close to e.g. Rec 709 primaries with gamma 2.2)
The standards for those displays are clear as mud, though, and are the subject of much debate in A/V calibration and color grading forums. People in the know claim a pure 2.4 power curve is the de facto standard for movies, dramas, etc. these days, but older content and live TV look better with a 2.2 power curve. Pretty much everything looks good with a 2.2 curve unless you're in a very dark room.
In short, I don't think mpv is doing anything wrong. The overcompensation is baked into the BT.1886 equation. If you use a player that can apply a precomputed 3D LUT, a transform computed by DisplayCAL will have the same characteristic, because it too knows the display's black level and applies the equation correctly. The `gamma-factor` option in mpv allows an adjustment on top of BT.1886 if you'd like to make it darker.
The way I like to think about it is that BT.1886 and ICC handle black point adjustments differently. ICC profiles scale the output transfer function on the "input side", that is, you take the linear-light input and stretch it (linearly) to fit the output transfer function (linearly).
BT.1886, instead, scales the transfer function on the "output side", that is, you take the resulting signal and stretch it (non-linearly) to fit the display range.
One makes more sense from a signal processing perspective (ICC methodology) since it simplifies interoperation and also matches physical intuition. The latter, arguably, makes more sense from a human psychovisual perspective (BT.1886 methodology), because it means the black point adjustment is "perceptually linear".
The reason this is such a can of worms is that both methods make sense in context. If anything, what's tricky is trying to mix the two approaches.
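The two approaches can be put side by side in a short sketch (naming is mine, not from any spec): the ICC-style adjustment lifts the black point linearly *after* the power function, while BT.1886 folds its offset into the signal *before* the power function:

```python
def gamma_output_offset(v, lw=100.0, lb=0.1, gamma=2.4):
    """ICC-style: stretch linear light linearly to fit the display range."""
    return lb + (lw - lb) * v ** gamma

def bt1886_input_offset(v, lw=100.0, lb=0.1):
    """BT.1886-style: fold the black offset into the signal (non-linear lift)."""
    g = 1.0 / 2.4
    a = (lw ** g - lb ** g) ** 2.4
    b = lb ** g / (lw ** g - lb ** g)
    return a * max(v + b, 0.0) ** 2.4
```

Both map signal 0 to roughly the black level and signal 1 to the white level, but for the same display the BT.1886 variant lifts mid-shadows considerably more, which is the "perceptually linear" stretch described above.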
Thanks for the detailed responses to both of you! I think I understand what's happening from a technical perspective better now.
> Pretty much everything looks good with a 2.2 curve unless you're in a very dark room.
That might explain it, because any serious viewing / work I do is usually in a pretty dark room.
Personally I think I'll stick with forcing the 2.4 curve. The extra bit of contrast looks right to me in the environment I'm in, and it seems both 2.2 and 2.4 get rid of the nasty black level compensation. (Is that why 2.2 is darker on my screen than the default Rec. 1886? Intuitively I would have expected the opposite, since Rec. 1886 is based on a 2.4 curve.)
> The correct BT.1886 profile to use for PNG images would be one with the properly calculated curve for your display's contrast ratio.
Does this mean that it's not possible to have a universal ICC profile for PNG images that instructs a color managed viewer to treat the image as authored under Rec. 1886? Is the 2.4 gamma curve approximation the best we can do? (This would imply that it's not possible to have screenshots that perfectly match the original film frame, because the images will be treated differently by the image viewer than the video file will by mpv. Maybe it would be better to set target-trc when taking screenshots so you get an sRGB adapted version of the screenshot...)
> If your display profile reflects the black level (the tone response curve does not go all the way to zero) then mpv will estimate the contrast ratio
Is there any way to set this value manually? I'm using an older Spyder model on a decent (but not fantastic) quality screen, and the precision at the darkest levels in the resulting profile is not great. If I do two runs back to back with the Spyder I can end up with two rather different looking profiles in the darkest shadows. I see there's `--icc-contrast` but that only sets an upper limit according to the man page.
Yes, haasn raises a good point. Everything I said about 2.2/2.4 gamma curves should be taken to mean 2.2/2.4 gamma curves with output offset black compensation. Otherwise, values close to black that are under the black point of your monitor are clipped.
> Is that why 2.2 is darker on my screen than the default Rec. 1886? Intuitively I would have expected the opposite, since Rec. 1886 is based on a 2.4 curve.
This may help answer your question: https://www.lightspace.lightillusion.com/error.html#bt1886_gamma Note that the example monitor has a 2000:1 contrast ratio. The BT.1886 curve for a 1000:1 screen is just a little darker than 2.2 at the top end, and goes below gamma 2.0 at the lower end. (note that Light Illusion goes as far as to call BT.1886 an error!)
> Does this mean that it's not possible to have a universal ICC profile for PNG images that instructs a color managed viewer to treat the image as authored under Rec. 1886? Is the 2.4 gamma curve approximation the best we can do?
I'm not sure. I think the viewer would have to do something like what mpv does and compute the BT.1886 curve given your display's profile. It is possible because mpv uses ICC profiles, after all, but I don't know that any image viewers implement it. I don't know what the right approach is using a standard color-managed image viewer.
If you are making screenshots for web, this thread may be interesting:
> I see there's `--icc-contrast` but that only sets an upper limit according to the man page.
I think that's the same thing. The only thing that matters in the equation is the ratio between the white level and black level. The resulting curve is the same for e.g. 0.1 nits black/100 nits white and 1 nit black/1000 nits white. (just with output level scaled 10x, of course)
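That ratio-only property is easy to check numerically. A quick sketch reusing the Annex 1 equations: two displays with the same 1000:1 contrast but 10x different absolute levels produce the same normalized curve.

```python
def bt1886(v, lw, lb):
    """BT.1886 Annex 1 EOTF for white level lw and black level lb."""
    g = 1.0 / 2.4
    a = (lw ** g - lb ** g) ** 2.4
    b = lb ** g / (lw ** g - lb ** g)
    return a * (v + b) ** 2.4

# 0.1/100 nits vs 1/1000 nits: same 1000:1 ratio, so after normalizing
# by the white level the curves coincide (absolute output is just 10x).
for v in (0.05, 0.25, 0.5, 0.9):
    assert abs(bt1886(v, 100, 0.1) / 100 - bt1886(v, 1000, 1) / 1000) < 1e-9
```

This works because the `b` term depends only on the ratio of the (1/2.4-power) levels, so a common scale factor cancels out of it entirely.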
> I think that's the same thing. The only thing that matters in the equation is the ratio between the white level and black level. The resulting curve is the same for e.g. 0.1 nits black/100 nits white and 1 nit black/1000 nits white. (just with output level scaled 10x, of course)
Maybe I misunderstood something, but if it sets an upper limit on the ratio (which is what the man page says), then it can't do what I want: mpv is overcompensating for black levels because it assumes my contrast ratio is lower than it actually is (a result of the display profile being inaccurate close to black). So unless I can set the precise ratio to use, or set a minimum, I won't be able to fix the ratio and reduce the amount of black compensation.
Thanks again for all the information and especially that Light Illusion link. I hadn't seen that before but it's very informative.
> If you are making screenshots for web, this thread may be interesting:
Yep, this seems to be basically the same idea just applied to video instead of screenshots. I'm not sure why you would do this for video unless browsers are treating tagged 709 video as if it were sRGB (yikes!). For images it's probably easier and safer, since pretty much every color managed image viewer assumes sRGB for untagged images, just to do `--target-trc=srgb`. Or you could just say that behavior around gamma varies too much from screen to screen anyway, and just ship whatever's in the video files (although, does that mean the raw 1.8 camera gamma, or the ~2.4 output gamma?!)
Sorry for droning on, but that suggests a possibly important question. If I take a screenshot in mpv with the default settings, the output is what I get from displaying the image with BT.1886, right (with a target TRC of 2.2)? But is that BT.1886 with black levels compensated for my screen, making the result inaccurate for anyone else, or is it using some approximation of a "generic" screen? It hadn't occurred to me before that it might vary in this way from device to device.
> Intuitively I would have expected the opposite, since Rec. 1886 is based on a 2.4 curve.
This intuition is the wrong way around. The lower you make the EOTF gamma, the brighter the image will get. Consider the limiting case of having a gamma of 1.0, in which case the compressed image will be treated as linear light, and a gamma of infinity, in which case the entire image will get squished to 0 by the mapping.
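A two-line check of the direction (plain Python, nothing mpv-specific): for any signal value between 0 and 1, a lower exponent gives brighter output.

```python
v = 0.5  # a mid-grey signal value
print(v ** 2.2, v ** 2.4, v ** 2.8)
assert v ** 2.2 > v ** 2.4 > v ** 2.8  # lower exponent -> brighter output
```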
> Does this mean that it's not possible to have a universal ICC profile for PNG images that instructs a color managed viewer to treat the image as authored under Rec. 1886?
Correct.
> I see there's `--icc-contrast` but that only sets an upper limit according to the man page.
An upper limit should be sufficient in practice. Your ICC profile either contains contrast information (in which case the option is unneeded), or it contains none (in which case the measured contrast will be infinity).
But I guess if your profile contains errors near black, this could be insufficient after all. I guess I'll change it.
> > Does this mean that it's not possible to have a universal ICC profile for PNG images that instructs a color managed viewer to treat the image as authored under Rec. 1886?
>
> Correct.
Interestingly, while no one seems to think this is a problem for BT.1886, there's a spec for putting BT.2100 PNG images on the web: https://www.w3.org/TR/png-hdr-pq/
> This intuition is the wrong way around. The lower you make the EOTF gamma, the brighter the image will get.
I was referring to modifying the tagged gamma with `--vf=format:gamma=`. When using this option, gamma2.2 is brighter than gamma2.4, which is brighter than gamma2.8 (which is what I expect: the higher the gamma exponent, the darker the image).
> In any case, I think I will stick with `gamma2.4` for watching movies.
@afontenot: Rather than add `--vf=format:gamma=gamma2.4` to the config, which will also mistag HDR videos, as another workaround you can modify the black point in the ICC profile to zero with ICC Profile Inspector, then set `--icc-contrast=inf`. mpv will calculate and use a contrast of infinity, producing a pure gamma 2.4 curve with identical results. Note that this could mess with black point compensation in other color managed programs, so you may want to set the profile manually with `--icc-profile=<profile>` instead of `--icc-profile-auto`.
> Is that why 2.2 is darker on my screen than the default Rec. 1886? Intuitively I would have expected the opposite, since Rec. 1886 is based on a 2.4 curve.
BT.1886 isn't darker than gamma 2.2 at the 10% level until the contrast is higher than 8000:1. Because BT.1886 is not a pure gamma curve, it has an effective gamma that varies from black to white. Here's a quick visualization I made:
Dashed lines are gamma 2.2 and 2.4, which are constant regardless of video value. Red is sRGB, blue is BT.1886 with a 1000:1 contrast, and green is BT.1886 with a 5000:1 contrast. Notice that even with a 5000:1 contrast, the gamma at 10% is actually 1.85, at 5% it is only 1.51, and it only gets worse from there! The math of the curve forces the gamma to approach 0 as you go into the blacks, which means that unless you have some ridiculous contrast value like 1,000,000:1, BT.1886 will always have lifted blacks and can never match anything resembling a gamma curve, no matter what gamma you choose.
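The "effective gamma" at a point can be computed as log(L/Lw) / log(V) on the BT.1886 curve. A sketch for a 1000:1 display (the exact numbers depend on how the black level is factored into the comparison, so this is illustrative rather than a reproduction of the graph above):

```python
import math

def bt1886(v, lw=1.0, lb=0.001):  # normalized white, 1000:1 contrast
    g = 1.0 / 2.4
    a = (lw ** g - lb ** g) ** 2.4
    b = lb ** g / (lw ** g - lb ** g)
    return a * (v + b) ** 2.4

def effective_gamma(v, lw=1.0, lb=0.001):
    """Exponent of the pure power curve passing through the same point."""
    return math.log(bt1886(v, lw, lb) / lw) / math.log(v)

# The effective exponent drops as the signal approaches black, i.e.
# shadows are lifted relative to any constant-gamma curve.
print(round(effective_gamma(0.5), 2),
      round(effective_gamma(0.1), 2),
      round(effective_gamma(0.05), 2))
```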
> BT.1886 will always have lifted blacks and can never match anything resembling a gamma curve, no matter what gamma you choose.
It's for exactly this reason that I think such 'theoretical' comparisons are useless. You should be comparing it against what a black-point-adjusted gamma curve would look like on an actual monitor with that contrast (versus BT.1886 tuned for that monitor).
@haasn:
Although most HD video is delivered with the BT.1886 EOTF, it seems that most professional video is actually graded on monitors calibrated to a pure gamma curve of 2.4 or 2.2, not BT.1886. Instead of following the standard, video is graded on a gamma curve and just incorrectly tagged as BT.1886, causing most content to appear to have lifted blacks. To get the intended display, you instead just have to assume it was graded to gamma 2.4 and change settings in software accordingly, or calibrate your own monitor to gamma 2.4.
This is kinda awkward, since if the default mpv config follows the spec, it won’t give you the intended display on most videos. Even worse is that if you calibrate and use an ICC profile, it still looks wrong, even though you think you’re calibrated.
Further complicating matters, this isn't always easy to fix even if you're aware of the issue: most other applications assume sRGB, even when color managed, which makes the configuration a bit of a pain. You have to either calibrate the whole desktop to gamma 2.4 and get wrong colors everywhere except in videos, or calibrate the desktop to sRGB and figure out which videos need to be interpreted as gamma 2.4 and configure it every time.
I’ve noticed there are a few related issues as well: https://github.com/mpv-player/mpv/issues/8009, https://github.com/mpv-player/mpv/issues/7840
- [Calibrated] The only difference mpv has with the spec right now is the `--icc-contrast=1000` default limit, which should be `inf` in order to match the spec, and let BT.1886 do as much compensation as it wants.
- `--icc-contrast` defaulting to `inf` instead of `1000`, same as in uncalibrated mode. (Perhaps `1000` instead of `inf` would be a better compromise.)
- `--icc-profile-auto` by default, to use the ICC profile automatically if configured in the OS.
- `--target-trc=auto` defaulting to `--target-trc=srgb` for SDR videos (instead of the current default, disabling any adaptation).

~~Also, enabling `--target-trc` currently assumes a BT.1886 black level of 0, giving infinite contrast, and producing a pure gamma 2.4 curve. I know `--icc-contrast` is intended to prevent that under calibration, but I think this is actually the best default. This follows the spec on high contrast/OLED displays, while slightly crushing blacks on other displays. But, if most video is mastered at gamma 2.4 anyways, then having BT.1886 fall back to 2.4 gamma when the display contrast is unknown would produce reasonable results most of the time, even on uncalibrated 1000:1 displays. And for what it's worth, this looks better to my eye, in either case, than the current default of `--target-trc=auto` disabling adaptation.~~

On further investigation, gamma 2.4 does actually look pretty bad on some videos. These may actually be mastered for gamma 2.2 or BT.1886. Perhaps a default contrast of `1000` instead of `inf` would be a better compromise.

- Change `--icc-contrast` to set the contrast directly instead of setting a maximum contrast. I don’t know what the intention was behind setting a maximum, but I think setting an exact value makes more sense. This would be useful for people who don’t like the contrast you get by following the spec and want to set it to a specific level instead. You’d be able to set it to either a higher or lower contrast, without the uncertainty of not knowing whether it’s having an effect, or having to use other workarounds to achieve the intended effect. If you instead wanted to maintain the same contrast across calibrated displays with different contrast, then setting `--icc-contrast` to the same value on all of the displays should produce similar results, even if some of the displays have contrast below the set value. This also makes it easier to increase contrast or interpret BT.1886 videos as gamma 2.4. Rather than having a script detect a BT.1886 EOTF and add `vf-add format:gamma=gamma2.4` to each video, we could instead add `--icc-contrast=inf` to the global config.
- `--icc-contrast=-1` as a default, to use the calculated contrast from the activated ICC profile but without the current upper limit of 1000. This should get you BT.1886, following the spec.
- Allow `--icc-contrast` when not profiled. Currently, if you set `--target-trc=srgb` on a BT.1886 video, mpv assumes a black level of 0, and you can’t change the contrast. But often, people do know the contrast from monitor specs or reviews. Users without calibration would be able to set this option to get a result closer to the BT.1886 spec, if desired.
- Maybe rename the option to `--bt1886-contrast`, since it can now be applied without an ICC profile, and because it’s only applied to BT.1886 EOTFs.
- `--target-trc=disabled` as an option to explicitly disable any adaptation. That way you can let the display perform HDR tone mapping, and it may help in debugging.

This tries to follow the spec by default whenever we have enough information to do so, while also providing defaults that should make real world videos look closer to intended in most scenarios, whether or not you’re calibrated. And if desired, the defaults can be more easily changed (i.e. without scripting) to get any desired behavior on an individual or global basis.
Anyways, I just came out of a deep rabbithole, and wanted to offer my suggestions.
> > BT.1886 will always have lifted blacks and can never match anything resembling a gamma curve, no matter what gamma you choose.
>
> It's for exactly this reason that I think such 'theoretical' comparisons are useless. You should be comparing it against what a black-point-adjusted gamma curve would look like on an actual monitor with that contrast (versus BT.1886 tuned for that monitor).
Good point. I should've done that. Sorry, this is kinda getting off topic now, but for anyone who still cares, here's an updated graph that compares gamma 2.2 and 2.4 with output offset to BT.1886 with input offset, on a 1000:1 contrast ratio. Blue = 2.2, orange = 2.4, green = BT.1886. (Zoomed into the interesting part, plus an actual luminance level plot.)
@nevubm could you please clarify what you mean here? I have nothing to do with mpv, just a curious bystander on this issue.
> Further complicating matters, even if you’re aware of this issue, most desktops assume sRGB, even when color managed, which makes the configuration a bit of a pain. You have to either calibrate the whole desktop to gamma 2.4 and get wrong colors everywhere except in videos, or calibrate the desktop to sRGB and figure out which videos need to be interpreted as gamma 2.4 and configure it every time.
Are you just talking about the 1D LUT calibration here? If you are calibrating and profiling, the video card gamma table shouldn't matter because the resulting color profile describes the monitor with the calibration 1D LUT loaded. A color-managed application should therefore look roughly the same regardless of what the display is calibrated to. (subject to real-world device limitations) I think this might be central to the issue #8009 that you linked to.
@leitec Sure. Say I want to interpret BT.1886 video as gamma 2.4 instead. One way to do this is that I calibrate my monitor to gamma 2.4 instead of sRGB, and I have the 1D LUT loaded in the GPU. Now if I play a BT.1886 video with `--icc-profile-auto`, then mpv will display the video as if it was gamma 2.4, as intended.
The problem is that every non-color managed application on the desktop (so pretty much every program) is assuming you're calibrated to sRGB, which makes everything beside videos look weird. This is fine if you're doing this on a media PC that does nothing but play video, but otherwise, to get the correct look in every other program, you'd have to switch between an sRGB profile and a gamma 2.4 profile every time you start or stop a video.
@nevubm I understand what you are saying about non-CM applications on the desktop. It's the other part that doesn't follow, though, based on what I said before. The profile effectively "cancels out" the 1D LUT because the CMS's goal is to reproduce the source color space on whatever the monitor profile describes, whether that's the raw monitor without a 1D LUT or a calibrated monitor. This is easy to demonstrate in practice. I don't see how `--icc-profile=auto` would be affected by the 1D LUT setting. LCMS2 doesn't touch the 1D LUT as far as I know.
It sounds like what you and #8009 want is a setting that tries to detect what your 1D LUT intends to be and use that as the source curve for the video? That might work if the calibration and profile are really accurate and the calibration follows a power curve, but won't work if the display doesn't follow a power curve very well. That can be the case on a profile that doesn't contain a 1D LUT (i.e. profile only, no calibration) which is what I use on my TV, for example.
In that case, though, wouldn't it be equivalent to use `vf=format=gamma=gamma2.4` (or whatever power you want) to declare your intent explicitly?
> The profile effectively "cancels out" the 1D LUT because the CMS's goal is to reproduce the source color space on whatever the monitor profile describes, whether that's the raw monitor without a 1D LUT or a calibrated monitor. This is easy to demonstrate in practice. I don't see how `--icc-profile=auto` would be affected by the 1D LUT setting. LCMS2 doesn't touch the 1D LUT as far as I know.
I get the feeling we're actually in agreement. The problem OP and I encountered is that when we're calibrated to sRGB, the BT.1886 input offset creates lifted blacks on many videos that are almost certainly not the output intended by the creator. To fix this we're trying to override BT.1886 and force mpv to interpret all BT.1886 EOTFs as gamma 2.4 instead.
> In that case, though, wouldn't it be equivalent to use `vf=format=gamma=gamma2.4` (or whatever power you want) to declare your intent explicitly?
`vf format:gamma=gamma2.4` does indeed do what I'm trying to achieve. But the problem with that is that you can't just set it and forget it. If you set it in the global config, then this will tag HDR and other videos as well. And if you don't set it in the global config, then you have to write a script to check the EOTF of every video you open and, if it's BT.1886, add this filter.
The desired result is to set some global config option (or a set of options) that will interpret BT.1886 as gamma 2.4 without affecting any other EOTFs. Such a config does not currently exist in mpv, which is one of the main things I'd like to change.
Of course, adding a way to interpret BT.1886 as gamma 2.2 could be helpful as well, for people who prefer that gamma. But that doesn't really follow any spec at all and is mostly subjective preference and guesswork about the mastering conditions, which seems a bit out of scope for mpv, and therefore does seem more appropriate for a script. Interpreting as gamma 2.4 is more easily justified because at least BT.1886 turns into gamma 2.4 at infinite contrast, and we can just hijack this behavior to get what we want.
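The infinite-contrast limit is easy to verify from the Annex 1 equations: with the black level at zero, the constants collapse to a = Lw and b = 0, leaving a pure 2.4 power curve. A quick sketch:

```python
def bt1886(v, lw=1.0, lb=0.0):
    """BT.1886 Annex 1 EOTF; lb = 0 models infinite contrast."""
    g = 1.0 / 2.4
    a = (lw ** g - lb ** g) ** 2.4
    b = lb ** g / (lw ** g - lb ** g)
    return a * (v + b) ** 2.4

# With lb = 0: a == lw and b == 0, so the curve is exactly lw * v ** 2.4.
for v in (0.05, 0.25, 0.5, 0.9):
    assert abs(bt1886(v) - v ** 2.4) < 1e-12
```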
> The profile effectively "cancels out" the 1D LUT because the CMS's goal is to reproduce the source color space on whatever the monitor profile describes, whether that's the raw monitor without a 1D LUT or a calibrated monitor. This is easy to demonstrate in practice. I don't see how `--icc-profile=auto` would be affected by the 1D LUT setting. LCMS2 doesn't touch the 1D LUT as far as I know.
@leitec Oh wait, no, you're right. I finally get what you're saying. Calibrating to gamma 2.4 wouldn't actually change anything or help with this issue. This stuff gives me a headache...
I missed the part about HDR, and yeah, I see that you wouldn't want to declare `vf=format=gamma=gamma2.4` for an HDR video. I'm content to define profiles in mpv.conf but I get why you'd want an automated way to do this.
I'm thinking I'll try to write some documentation for this and contribute it to mpv (if they'll take it), since reading different issues in the bug tracker here and elsewhere I see that there's confusion between the 1D LUT calibration (or native transfer curve) of your particular display versus what we're assuming about the source encoding (and mastering conditions) of the video.
> Of course, adding a way to interpret BT.1886 as gamma 2.2 could be helpful as well, for people who prefer that gamma. But that doesn't really follow any spec at all and is mostly subjective preference and guesswork about the mastering conditions, which seems a bit out of scope for mpv, and therefore does seem more appropriate for a script. Interpreting as gamma 2.4 is more easily justified because at least BT.1886 turns into gamma 2.4 at infinite contrast, and we can just hijack this behavior to get what we want.
Guesswork, perhaps, but it is also justified if we consider the end user's viewing conditions. Suppose we assume that the grading/mastering was done on a gamma 2.4 display with a very dim light. The color grading book I have (Alexis van Hurkman) dates back to 2014, when 2.4 apparently started replacing 2.2 as the standard in grading suites. He notes that studios using 2.4 dial down the bias lighting. The surrounding light is much dimmer than typical room lighting, and it is therefore appropriate to brighten the image a bit by using 2.2 when viewing in the typical room setting. I've noticed that even on high-contrast displays, using 2.4 gamma in such a setting causes a loss in near-black detail. (am I an outlier for watching videos in room lighting? :wink:)
In that same vein, I've noticed that live TV and a lot of YouTube videos and the like tend to look bad with gamma 2.4, and I think it comes down to excessive saturation in dark parts of the image. (not a pro, can't confirm) The same book goes over the use of the surround effect to reduce saturation in dark parts of the image while preserving perceived colorfulness. He mentions this is typical of professional (graded) video. It certainly made a difference when I played around with the technique in Resolve while working on a 2.4 display.
Anyway, that's all to say that I've been content with gamma 2.2 as the default unless I'm watching a relatively recent movie in the dark, and that I think it makes a better default than 2.4. I suppose BT.1886 ends up working OK in the typical room lighting scenario on most displays, but that feels like an accident 🙂 (since, if I understand correctly, it's trying to create perceptual uniformity between displays with different black levels, not address differing viewing conditions)
> Rather than add `--vf=format:gamma=gamma2.4` to the config, which will also mistag HDR videos, as another workaround you can modify the black point in the ICC profile to zero with ICC Profile Inspector, then set `--icc-contrast=inf`.
I just realized, editing the black point tag in an ICC profile doesn't actually work as a workaround. LCMS calculates the black point from the profile rather than reading it from the tag, as suggested by the ICCv4 spec, so this modification has no effect.
@haasn, do you have any thoughts about the proposed changes? https://github.com/mpv-player/mpv/issues/8082#issuecomment-754959919
I've changed the title of this issue to better reflect the actual issue that we discovered. I left a detailed note to that effect in the top level comment.
Some thoughts on the suggestions by @nevubm (note: I'm not a dev for this project).
Although most HD video is delivered with the BT.1886 EOTF, it seems that most professional video is actually graded on monitors calibrated to a pure gamma curve of 2.4 or 2.2, not BT.1886. Instead of following the standard, video is graded on a pure gamma curve and just incorrectly tagged as BT.1886, causing most content to appear to have lifted blacks. To get the intended display, you instead just have to assume it was graded to gamma 2.4 and change settings in software accordingly, or calibrate your own monitor to gamma 2.4.
I think this is potentially misleading. It might be true that a lot of films are professionally graded at 2.4, but they are probably using screens with very low (possibly zero) black levels. The whole point of black level compensation is that if you display that media with gamma 2.4 on a crappy consumer screen, you end up crushing the blacks. Which is also not viewing the film as intended. So it's really just a matter of trade-offs.
[Calibrated] The only difference mpv has from the spec right now is the `--icc-contrast=1000` default limit, which should be `inf` in order to match the spec and let BT.1886 do as much compensation as it wants.
IIRC the reasoning behind this is that some ICC profiles supposedly have the contrast set way too high, so this setting was introduced to fix that problem. Of course it's the opposite of the problem that we're having in this thread. So it's possible you do need a limit. (How much difference is there between BT.1886 and 2.4 at a contrast ratio of 1000? I'm not sure...)
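For anyone curious about that last question, here's a quick back-of-the-envelope check (a Python sketch, not mpv's actual code) of how far BT.1886 at a 1000:1 contrast ratio deviates from a pure 2.4 power curve, using the reference EOTF from ITU-R BT.1886:

```python
# Sketch: compare the BT.1886 EOTF at 1000:1 contrast against a pure
# gamma 2.4 curve, both normalized so that white = 1.0.

def bt1886(v, lw=1.0, lb=1.0 / 1000):
    """BT.1886 EOTF for signal v in [0, 1], white level lw, black level lb."""
    g = 2.4
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

def gamma24(v):
    return v ** 2.4

for v in (0.05, 0.1, 0.25, 0.5, 0.75):
    print(f"{v:.2f}  bt1886={bt1886(v):.5f}  g2.4={gamma24(v):.5f}")
```

Running this suggests the difference is large near black (BT.1886 raises a 5% signal several times above the pure 2.4 value) and shrinks toward the mid-tones and highlights, so the 1000 limit mostly matters for shadow detail.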
[Uncalibrated] Assume the display is close to sRGB, since most displays tend to target this. (make `--target-trc=auto` default to `--target-trc=srgb` for SDR videos, instead of the current default of disabling any adaptation)
I agree that this would be more theoretically correct. But it also goes against what most other programs do in the absence of color management, and possibly cuts against user expectations. (Most programs do not enable color management in the absence of reliable data. Should we also "correct" 601 primaries to 709? Hard to say.)
Change `--icc-contrast` to set the contrast directly instead of setting a maximum contrast. I don't know what the intention behind setting a maximum was, but I think setting an exact value makes more sense.
I think this is the ideal solution from a technical standpoint, as `icc-contrast` is really misleading; it should be `icc-contrast-max`. I wonder, though, if we need to worry about users who are already using this option in their configs. Changing the meaning of the option could break things unexpectedly.
Overall I'd do it like this:
Add two new options, `icc-contrast-min` and `icc-contrast-max`. Have the default value for both be 'none', unless 1000 is the desired maximum value.
Either use `icc-contrast` to set the contrast manually, or add another option, `icc-contrast-force`, to do that, and let `icc-contrast` be an alias for `icc-contrast-max`, but remove it from the man page and show the user a deprecation warning in the terminal output so we can remove it completely after several releases. At that point `icc-contrast-force` could be changed to `icc-contrast`, or just left alone. If we're using `icc-contrast` to set the contrast, the default value should be 'auto'. If we're using `icc-contrast-force`, the default should be 'none'.
If there's not any movement on this for a while, I'll send a pull request to do a minimal version of the above (likely skipping the idea to sunset `icc-contrast`). The other suggestions, like implementing color management on screens with no known profile, are a bit above my pay grade, I suspect.
If I understand this correctly, bt.1886 only matches 2.4 when you have your black levels at 0 nits and therefore your display is capable of achieving infinite contrast ratios (OLEDs). The EOTF is supposed to compensate for the higher black levels of LCDs, raising blacks to prevent black crush.
With this in mind, does leaving gamma alone on uncalibrated LCD monitors (mpv's main use case, if I had to guess) make sense? As far as I understand sRGB has a piece-wise gamma that's approximately 2.2 because computers are generally used in bright rooms, while Rec.709 went with 2.4 since you're likely going to be using your TV in dark conditions.
I understand choosing a default option is pretty subjective and there's probably not an objectively correct answer here.
I might be wrong, I admit I don't know enough about this subject to be certain about what I'm typing, but it seems that mpv's defaults are tailored with TVs in mind. `--target-trc=auto`, `--target-trc=bt.1886` and `--target-trc=gamma2.4` all look the same here on SDR content without an ICC profile, which seems to imply that mpv is assuming infinite contrast (which isn't the case).
--target-trc=auto, --target-trc=bt.1886 and --target-trc=gamma2.4 all look the same here on SDR content without an ICC profile, which seems to imply that mpv is assuming infinite contrast (which isn't the case).
That's because mpv only attempts to adjust the intended gamma to your display's actual gamma when color management is enabled. If you don't use an ICC profile this option is a no-op. So in answer to your other question, mpv isn't assuming BT.1886 for uncalibrated screens, because it isn't performing any gamma correction at all on those screens. A previous post suggested that mpv should target sRGB for uncalibrated screens, but this is not something that has been implemented. (And if I had to guess, it won't be.) It's not completely crazy, if you think mpv should optimize for the as-intended viewing experience in ideal (darkened) viewing conditions.
To be fair, after thinking about this for a while, the current defaults are honestly reasonable. `--target-trc=gamma2.2` could potentially make sense as an alternative, but neither will be objectively better on all displays and in all viewing conditions. The current `--target-trc=auto` also makes sure images show up as they do in other colour-unmanaged software, which helps with consistency.
If I understand this correctly, bt.1886 only matches 2.4 when you have your black levels at 0 nits and therefore your display is capable of achieving infinite contrast ratios (OLEDs). The EOTF is supposed to compensate for the lower contrast ratios of LCDs and lower the "effective" gamma to make the perceived image less washed out on displays with higher black levels.
BT.1886 applies input offset, while typical display calibration applies output offset. They're two different approaches to solve the same problem: that real displays can't achieve a perfect 0 nits of absolute black. (not even OLEDs, because of screen reflections)
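To illustrate that distinction, here's a hedged sketch (the black level is an illustrative assumption, roughly a 500:1 display) of the two compensation styles: BT.1886 offsets the *signal* before the power curve, while typical output-offset calibration scales the *light* afterwards. Both map 0 to the display's black level and 1 to white, but they distribute the compensation differently:

```python
# Sketch: input offset (BT.1886-style) vs. output offset (typical
# display calibration), for a display with normalized black level LB.

G = 2.4
LB = 0.002  # assumed black level (white = 1.0), ~500:1 contrast

def input_offset(v, lb=LB, g=G):
    # BT.1886: L = a * max(v + b, 0) ** g, offsets applied to the signal
    a = (1 - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (1 - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

def output_offset(v, lb=LB, g=G):
    # Simple output-referred compensation: squeeze the pure power
    # curve into the [lb, 1] light range
    return lb + (1 - lb) * v ** g

for v in (0.0, 0.1, 0.5, 1.0):
    print(f"{v:.1f}  input={input_offset(v):.4f}  output={output_offset(v):.4f}")
```

The input-offset form lifts the shadows considerably more than the output-offset form, which is exactly the raised-blacks behavior being discussed in this thread.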
With this in mind, does leaving gamma alone on uncalibrated LCD monitors (mpv's main use case, if I had to guess) make sense? As far as I understand sRGB favours 2.2 because computers are generally used in bright rooms, while Rec.709 went with 2.4 since you're likely going to be using your TV in dark conditions. I understand this is pretty subjective and there's probably not an objectively correct answer.
This is actually two separate issues. The first is: which gamma curve is the native uncalibrated display closest to? And the second is: which gamma curve is the encoded video closest to? You need to know both to get an accurate picture, and mpv gets both wrong. By default, it assumes the uncalibrated display is BT.1886 and reads the video gamma from metadata (usually BT.1886), when the display is usually closer to sRGB and the video closer to gamma 2.4, despite being tagged as BT.1886. Now, this behavior is consistent with most other video players, but unlikely to be the output intended by most video creators, whether they are big movie studios or small YouTubers, which is why it's all a big mess.
mpv isn't assuming bt. 1886 for uncalibrated screens, because it isn't performing any gamma correction at all on those screens. A previous post suggested that mpv should target sRGB for uncalibrated screens, but this is not something that has been implemented. (And if I had to guess, it won't be.) It's not completely crazy, if you think mpv should optimize for the as-intended viewing experience in ideal (darkened) viewing conditions.
By not performing any transformation, you will only get the correct output if the video encoding gamma is close to that of the display. Since video is encoded as either BT.1886 or gamma 2.2/2.4, this will match a TV more often than a PC monitor, most of which target sRGB. (Note: sRGB is not the same as gamma 2.2, not even close.) This is effectively an implicit assumption of BT.1886, which was the rationale for suggesting a default of `--target-trc=srgb` instead. Also, `--target-trc` has nothing to do with viewing conditions; it only changes assumptions about the display EOTF. Gamma adjustments for viewing conditions are configured by changing the video gamma.
I might be wrong, I admit I don't know enough about this subject to be certain about what I'm typing, but it seems that mpv's defaults are tailored with TVs in mind.
Yes this was one of the issues I wanted to address. The default is good for TVs but not PCs, where it will mainly be used.
`--target-trc=auto`, `--target-trc=bt.1886` and `--target-trc=gamma2.4` all look the same here on SDR content without an ICC profile, which seems to imply that mpv is assuming infinite contrast (which isn't the case).
Without an ICC profile, mpv has no information about the display contrast, and assumes a black level of 0, giving infinite contrast. It's debatable whether this is a good fallback. The alternative is assuming some low contrast like 1000:1 that can work well enough on all displays, but doesn't look great on any.
That's because mpv only attempts to adjust the intended gamma to your display's actual gamma when color management is enabled. If you don't use an ICC profile this option is a no-op.
I think it's actually the opposite. `--target-trc` is only active when no ICC profile is used in mpv. From the docs:
Specifies the transfer characteristics (gamma) of the display. Video colors will be adjusted to this curve when ICC color management is not being used.
By not performing any transformation, you will only get the correct output if the video encoding gamma is close to that of the display.
Sure, but "correct" is a somewhat vague concept, given that PC screens are ~sRGB precisely because they're used in brighter conditions. If you assume that whatever environment mpv runs in shares those same, too-bright conditions, maybe it makes sense to stick with the default output of sRGB rather than guessing. It's "wrong", but there's not really a right way to do it anyway on uncalibrated screens.
I think it's actually the opposite. --target-trc is only active when no ICC profile is used in mpv. From the docs:
You're right, I forgot what this option did. It's been a while since I looked at this thread. Not sure what's causing the result they're talking about: targeting 2.2 and 2.4 should look very different, at least in darker scenes.
@nevubm Thanks for the reply! `target-trc=srgb` does look subjectively better and more "contrasty" here, but aren't computer displays supposed to target 2.2?
@afontenot
Not sure what's causing the result they're talking about: targeting 2.2 and 2.4 should look very different, at least in darker scenes.
2.2 and 2.4 do look very different, it's 2.4 and bt.1886 that look the same (due to the infinite contrast assumption).
Thanks for the reply! `target-trc=srgb` does look subjectively better and more "contrasty" here, but aren't computer displays supposed to target 2.2?
The standard for computer displays has been sRGB for basically forever. Gamma 2.2 is just an approximation of sRGB but tends to be confused for the actual TRC. sRGB has an average gamma of 2.2, but it's really gamma 2.4 spliced with gamma 1.0 at around 4%.
So computer displays are supposed to target sRGB, not gamma 2.2, but of course, every manufacturer seems to have their own opinion on what looks best, and has default settings tuned accordingly. Some may target gamma 2.4 or 2.2 and market it for movies, others will stick with sRGB, some may even deliberately crush blacks to simulate more contrast, and all of this gets baked into the monitor presets like "video" or "photo". But at least the standard/neutral preset tends to target sRGB.
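For reference, the piece-wise sRGB curve described above (from the IEC 61966-2-1 spec) can be sketched in a few lines of Python and compared against a plain 2.2 power curve:

```python
# Sketch: the sRGB EOTF is linear below ~4% signal, then a scaled and
# offset 2.4 power curve above that; overall it averages near gamma 2.2.

def srgb_eotf(v):
    """Decode an sRGB-encoded value v in [0, 1] to linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22(v):
    return v ** 2.2

for v in (0.02, 0.1, 0.5, 0.9):
    print(f"{v:.2f}  srgb={srgb_eotf(v):.5f}  g2.2={gamma22(v):.5f}")
```

The two curves track each other loosely in the mid-tones but diverge near black, where sRGB's linear segment keeps shadow values higher than a pure power curve would; that near-black difference is what the discussion below is about.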
Having an uncalibrated display isn't actually that bad; the problem is when you and the manufacturer make different assumptions about the display and signal. If the manufacturer is expecting an sRGB signal and transforms the signal to output gamma 2.4, yeah, it's a bit off, but it's not the end of the world. But if you feed the display gamma 2.4 when the manufacturer is expecting sRGB, the display is going to transform the already-transformed signal and make the error twice as bad!
You want to target the average across all monitors to minimize the potential error, and that is where following a standard helps. Some monitors will end up too bright, some will look too dark, but as long as you use sRGB, it shouldn't be too far off.
The standard for computer displays has been sRGB for basically forever. Gamma 2.2 is just an approximation of sRGB but tends to be confused for the actual TRC. sRGB has an average gamma of 2.2, but it's really gamma 2.4 spliced with gamma 1.0 at around 4%.
I understand, I was just under the impression that the sRGB gamma curve should be used for encoding, while displays use a pure power 2.2 EOTF. Long story short, it seems the standard itself was rather confusing, but we can easily verify that normal consumer-grade monitors at least attempt to target sRGB gamma:
This is from the RTINGS review of the Dell P2417H, an average IPS monitor. It certainly isn't exactly sRGB but it's clearly closer to it than to a pure 2.2.
What's kinda funny, though, is that Apple defines their "Display P3" colourspace with the DCI-P3 primaries, a D65 white point and the sRGB TRC, but then calibrates their own iPhones, which are supposed to target it, to a 2.2 EOTF. In any case, I went through a few monitor reviews on multiple websites and they indeed tend to target sRGB, while TVs vary a bit more between 2.2 and 2.4.
Having an uncalibrated display isn't actually that bad, the problem is actually when you and the manufacturer make different assumptions about the display and signal. If the manufacturer is expecting a sRGB signal, then transforms the signal to output gamma 2.4, yea it's a bit off, but it's not the end of the world. But if you feed the display gamma 2.4, when the manufacture is expecting sRGB, the display is going to transform the already transformed signal and make the error twice as bad!
You want to target the average across all monitors to minimize the potential error, and that is where following a standard helps. Some monitors will end up too bright, some will look too dark, but as long as you use sRGB, it shouldn't be too far off.
Alright so, please correct me if I'm wrong, but if I understand this correctly, Rec.709, the HDTV standard, defines an EOTF of power 2.4 at infinite contrast (ITU-R BT.1886), which is what you're supposed to be using in a dark room for mastering. As we've agreed upon previously, this is perfectly fine as long as both ends agree to the same rules, and in this case those rules apply to TVs.
Since monitors target a different piece-wise gamma with lifted blacks (sRGB) and the video expects a TV with a power 2.4 TRC, of course things will look slightly off on a monitor. Going from 2.4 to 2.2 makes things slightly brighter, and that's OK considering monitors are generally used in brighter environments. But we have to consider that they aren't 2.2; the piecewise curve (as you mentioned) has a slightly different shape with slightly lifted blacks.
`--target-trc=srgb` counters this difference because you're essentially telling mpv "my display follows this TRC that may or may not match the content, so please fix things before sending it to the display". Since video is usually done with 2.4 in mind, this means mpv needs to "unlift" the blacks. And this behaviour can be easily verified.
`mpv --no-config --screenshot-format=png`:
`mpv --no-config --screenshot-format=png --target-trc=srgb`:
And this is probably the intended picture, but only if you also watch it in the same viewing conditions as it was mastered, which means a dark room with controlled light. In a normal room with some light hitting the display, the picture will look slightly off. Using `--gamma-factor=1.1` as advised in the manual lifts it up a little bit.
`mpv --no-config --screenshot-format=png --target-trc=srgb --gamma-factor=1.1`:
Now we have a result that "fixes" the lifted blacks from the sRGB TRC the monitors target, while also compensating the brightness for a "moderately lit room" as explained in the manual.
Since we also agree that computer monitors are generally not used in pitch black rooms, I believe this might be a better option than just `--target-trc=srgb`. Do you have any thoughts about this?
And this is probably the intended picture, but only if you also watch it in the same viewing conditions as it was mastered, which means a dark room with controlled light. In a normal room with some light hitting the display the picture will look slightly off. Using `--gamma-factor=1.1` as advised in the manual lifts it up a little bit. Now we have a result that "fixes" the lifted blacks from the sRGB TRC the monitors target, while also compensating the brightness for a "moderately lit room" as explained in the manual.
Since we also agree that computer monitors are generally not used in pitch black rooms, I believe this might be a better option than just `--target-trc=srgb`. Do you have any thoughts about this?
Adding gamma does work, and is also equivalent to assuming a video gamma of 2.2 via `--vf=format:gamma=gamma2.2`, since 2.4/1.1 ≈ 2.2. It's not a bad assumption that most PC displays are in bright viewing conditions, but also, maybe some people just turn off the lights when they watch a movie. So I'm a bit wary of doing things like this as a default, and I'd prefer to follow the spec whenever there are uncertainties like this.
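A quick sanity check of that 2.4/1.1 ≈ 2.2 equivalence (just illustrative arithmetic, not mpv code):

```python
# Sketch: dividing a 2.4 power curve's exponent by a gamma factor of 1.1
# yields an effective exponent of ~2.18, very close to a plain 2.2 curve.

def power(v, g):
    return v ** g

factor = 1.1
effective = 2.4 / factor
print(f"effective gamma: {effective:.3f}")

for v in (0.1, 0.3, 0.6, 0.9):
    print(f"{v:.1f}  g2.4/1.1={power(v, effective):.4f}  g2.2={power(v, 2.2):.4f}")
```

The two columns agree to within a fraction of a percent of full scale across the range, which is why the gamma-factor workaround and the 2.2 video-gamma assumption look essentially identical in practice.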
BT.1886 doesn't define any viewing conditions or methods for compensation, just a note that says "if it is required to confirm that a display device meets the reference equation it is recommended that the measurement be conducted in a dark room." So it seems like they're assuming a dark room, but strictly speaking, you don't need to be in a dark room or apply any compensation to follow the spec.
Besides, I feel like compensating for viewing conditions is a rough approximation and kinda subjective anyways. Gamma 2.2 in a bright room is only an approximation of gamma 2.4 in a dark room to allow you to see more dark details, but the experience is still going to be different in another environment no matter how you compensate. As an extreme example, just imagine watching something at the beach, and trying to fiddle with the knobs to get the same experience as a movie theater. So it's better to try to match viewing conditions than to compensate for it, and likely simpler as well.
Yeah I agree, the fact that mpv allows us to configure this to our liking is great, viewing conditions vary greatly between users. We should probably leave the current default behaviour without an ICC profile as it is though, "advanced" users are likely going to be configuring their player anyway and the current defaults are consistent with other programs without colour management enabled.
I'm not very good at this subject (far from it), but I had an idea. What if we record the desktop with screen-capture software, take a screenshot during the recording, and try to match the screenshot colors with the time-exact frame from the recorded video? Assuming the video and screenshot tags are correct, whichever value of target-trc matches the screenshot colors should be the correct one, right?
I tried this on Windows with Nvidia's record feature and Windows's own Win+PrtScr shortcut for screenshots, and it seems no tuning was necessary. Hopefully I'm on the right track here, and if not, someone can correct me on what I got wrong.
@Silver-Fullbuster No, taking a screenshot from the video itself or from a screen recording makes no difference in this case. You can already take the screenshot as mpv shows it on your screen with Ctrl+S. `--target-trc` is there for you to tell mpv what your display's electro-optical transfer function looks like without using an ICC profile; you'd need to be taking pictures or using a colorimeter/spectrometer to make any half-accurate comparisons.
i think the question itself has been discussed and/or answered. if there are still issues i think it would be best to open a new issue for each of them so we can tackle them one after another, if there aren't already open issues.
if this was closed prematurely, please ping me and i can reopen this one.
In fact, when I disable color management and use
And how did you disable color management in mpv? There are all these options for managing an ICC file/table, but none to just disable ICC.
Note: the previous title of this issue was Any way to target my display's response curve while in ICC mode?
My original assessment of the problem was inaccurate: mpv does adapt the image to your display's actual response curve. However, it does so using BT.1886 as the EOTF for HD media, which contains black point compensation, raising blacks to avoid crushing them below the black point of the display. This is sometimes done inaccurately, as when an ICC profile contains inaccuracies near black. Other times, a user might prefer to crush blacks slightly rather than have raised, annoying greys. In either case, the user may want to set the ICC contrast manually, but mpv does not allow this. The `--icc-contrast` option only sets an upper limit on the contrast, when the issue in this case is precisely that the contrast is too low.
There's a great comment right here that spells out most of the technical details.
The rest of this post is left as is to preserve the original conversation.
This might turn out to be an issue, but at present it's just a question.
I have a laptop display that's wide-gamut SDR and profiled with a colorimeter. I've set mpv to use this profile (and it's set in my operating system's CMS - colord - and works correctly in other applications).
My display's EOTF is pretty close to sRGB / gamma 2.2, not Rec. 1886 as would be appropriate for movies. (And since it's a laptop screen, this is not adjustable.) Judging by the apparent output gamma, mpv appears to be displaying the images at my display's "native" EOTF, i.e. not converting the image from the "correct" 1886 curve to my actual curve. In other words, the image is significantly too bright, by my judgment.
mpv has the `--target-trc` option. My understanding of the man page is that when this is enabled, mpv will do what I want, i.e. adapt the gamma of the image to my screen. However, it also says that video colors will be adjusted to this curve only when ICC color management is not being used.
In fact, when I disable color management and use `--target-trc` to target the sRGB curve, I get a much better picture. It seems to do exactly what I'd expect. Of course, I want and need to use color management to get correct colors on my wide gamut screen. Is there any way to get mpv to adapt to my display's gamma while in color management mode? Even better, could mpv pick up the information it needs to do this accurately from my ICC profile, so I don't have to specify a badly fitting curve to use?