How does the official Nvidia client behave in this scenario?
The HDR trick that's necessary to get AC Valhalla to stream in HDR doesn't seem to work with the official client. The same goes for the other HDR games I have installed. But the output stays at 12-bit even when streaming non-HDR content, which causes lots of colour banding in the Steam Big Picture background shown while a game is launching, for example. I'm downloading Far Cry 5, which I see is supposed to work with HDR.
I've been unable to get any game to run in HDR in the Nvidia client. Games like Far Cry 5 and Assassin's Creed Origins launch directly into HDR in Moonlight, but the HDR option is greyed out when launching through the Nvidia client.
Edit: I made one final attempt by disconnecting everything but the HDMI EDID converter and rebooting my PC. The Nvidia client streamed the emulated monitor, but it refused to go into HDR mode even when I was able to turn on HDR in the games. Hopefully someone else will be able to test.
Some new discoveries: turning on Dolby Vision isn't enough; my TV also needs "PC mode" enabled on the input. If I turn that off but leave Dolby Vision on, and stay in game mode, the banding goes away. Reading up, it looks like "PC mode" on the TV (LG C8) makes it treat all incoming HDR signals as 4:2:0. With Dolby Vision on, the input is 12-bit 4:2:2; with Dolby Vision off, it is 10-bit 4:2:0. So #usererror, but having Moonlight automatically switch to 10-bit 4:2:0 when in HDR might still be a good idea to implement.
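To make the suggested automatic switch concrete: an Android app doesn't set the HDMI output format itself; it declares the stream's HDR characteristics to the decoder, and the OS/firmware then negotiates an output mode. A minimal sketch of that declaration in Java (the language of Moonlight's Android client), assuming placeholder width, height, and surface values, might look like this:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Minimal sketch: configure an HEVC Main10 decoder with HDR10 colour
// metadata. Signalling ST 2084/BT.2020 here is what lets an Android TV
// device engage an HDR10 output mode rather than relying on a Dolby
// Vision system mode. Width, height, and surface are placeholders.
public final class Hdr10DecoderSetup {
    public static MediaCodec createHdr10Decoder(Surface outputSurface,
                                                int width, int height)
            throws java.io.IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_HEVC, width, height);

        // Request the 10-bit HEVC profile.
        format.setInteger(MediaFormat.KEY_PROFILE,
                MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10);

        // Declare HDR10 colour information: BT.2020 primaries with the
        // SMPTE ST 2084 (PQ) transfer function, limited range.
        format.setInteger(MediaFormat.KEY_COLOR_STANDARD,
                MediaFormat.COLOR_STANDARD_BT2020);
        format.setInteger(MediaFormat.KEY_COLOR_TRANSFER,
                MediaFormat.COLOR_TRANSFER_ST2084);
        format.setInteger(MediaFormat.KEY_COLOR_RANGE,
                MediaFormat.COLOR_RANGE_LIMITED);

        MediaCodec decoder = MediaCodec.createDecoderByType(
                MediaFormat.MIMETYPE_VIDEO_HEVC);
        decoder.configure(format, outputSurface, null, /* flags */ 0);
        return decoder;
    }
}
```

Whether the SHIELD then drives the TV at 10-bit 4:2:0 or 12-bit 4:2:2 is down to its HDMI negotiation; presumably Plex benefits from the same mechanism when playing HDR10 content.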
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
Describe the bug
Colour banding when streaming HDR games to an Nvidia SHIELD with Dolby Vision enabled. The output from the SHIELD is 12-bit 4:2:2 when Dolby Vision is turned on. If I turn off Dolby Vision, the output becomes 10-bit 4:2:0 and the colour banding is fixed. Apps like Plex are able to switch the output from 12-bit to 10-bit when playing HDR10 movies, so it sounds like that should be possible for Moonlight to do as well (see the sketch after this report).
Steps to reproduce
Moonlight settings
I've forced H.265 and selected the highest bitrate. Output is 4K60.
Device details
Server PC details
Additional context
I'm using an HDMI EDID emulator to be able to turn on HDR in Windows, enabling me to use HDR in more games. The settings for that emulated monitor are set to 10-bit 4:2:2. The EDID is for the TV that I'm eventually outputting to from the SHIELD.
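For reference, here is a hedged sketch (a hypothetical helper, not Moonlight's actual code) of how an Android client could check whether the display chain advertises HDR10 at all before deciding how to present the stream; on a SHIELD connected to a Dolby Vision TV, both types are typically reported:

```java
import android.app.Activity;
import android.view.Display;

// Hypothetical probe: check which HDR types the current display chain
// advertises. When both Dolby Vision and HDR10 are reported, a client
// could prefer plain HDR10 so the SHIELD negotiates a 10-bit 4:2:0
// output instead of the Dolby Vision 12-bit 4:2:2 path.
public final class HdrProbe {
    public static boolean supportsHdr10(Activity activity) {
        Display display = activity.getWindowManager().getDefaultDisplay();
        Display.HdrCapabilities caps = display.getHdrCapabilities();
        if (caps == null) {
            return false;
        }
        for (int type : caps.getSupportedHdrTypes()) {
            if (type == Display.HdrCapabilities.HDR_TYPE_HDR10) {
                return true;
            }
        }
        return false;
    }
}
```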