Closed MelvinSmiley closed 2 months ago
What is actually displayed on your screen and what is rendered by the GPU are two completely different things. If the game outputs more nits than the TV can handle, the TV will either clip the highlights (in HGiG mode) or try to tone-map them back into its displayable range (if Dynamic Tone Mapping, DTM, is enabled).
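To make the two behaviours concrete, here is a minimal illustrative sketch. The clip function mirrors HGiG-style behaviour; the rolloff uses a simple Reinhard-style curve purely for illustration — real TVs use their own proprietary DTM curves, and the function names here are hypothetical.

```python
def clip_hgig(scene_nits: float, display_peak: float) -> float:
    """HGiG-style: luminance above the display's peak is simply clipped."""
    return min(scene_nits, display_peak)

def tonemap_dtm(scene_nits: float, display_peak: float) -> float:
    """DTM-style: compress the whole range so highlights roll off smoothly
    instead of clipping (illustrative Reinhard-style curve, not a real TV's)."""
    return display_peak * scene_nits / (scene_nits + display_peak)

# A 958-nit highlight on a 753-nit panel:
print(clip_hgig(958, 753))    # hard-limited to the panel peak
print(tonemap_dtm(958, 753))  # compressed to somewhere below the peak
```

Either way, the game can keep rendering 958-nit values internally; only what the panel finally shows changes.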
> What is actually displayed on your screen and what is rendered by the GPU are two completely different things. If the game outputs more nits than the TV can handle, the TV will either clip the highlights (in HGiG mode) or try to tone-map them back into its displayable range (if Dynamic Tone Mapping, DTM, is enabled).
This TV doesn't have HGiG, only Gradation preferred, Brightness preferred, and Off. I read that the first two should only be used in applications that don't let you set max nits, so I have it set to Off. But thanks for your explanation, it makes a lot of sense.
> What is actually displayed on your screen and what is rendered by the GPU are two completely different things. If the game outputs more nits than the TV can handle, the TV will either clip the highlights (in HGiG mode) or try to tone-map them back into its displayable range (if Dynamic Tone Mapping, DTM, is enabled).
Correct.
I'm sorry if this is a dumb question, but how can the brightness measured in Cyberpunk 2077 with the HDR analysis tool be 958 nits when my Sony 90XJ 75-inch can only output 753 nits in a 10% peak window according to rtings? That doesn't make sense to me.