WoundedBum opened 11 months ago
To make it clearer: in RE2, for example, HDR can't be selected; in Dead Space Remake, on the other hand, it can.
My experience with this driver: my current setup involves a Hyper-V VM (Win 11) running Sunshine, hosted on another Win 11 machine from which I connect with Moonlight. I had to enable HDR on both desktops (host and VM), in the Moonlight settings, and in the game running in the Hyper-V VM. I also made sure the TV has HDR turned on for the HDMI port (HDMI UHD Color in my case).
Seeing the same behavior with RE2 in particular - it doesn't detect the display as HDR. Windows pops up the "Auto HDR" notification and everything when the game is launched, but nothing happens in-game. Using the Windows 11 HDR files with W11 23H2.
Was the IDD monitor set as the primary display? If not, try that!
Yes, I believe so. I have a script that Sunshine runs ahead of launching, which enables the virtual display and makes it primary.
I should note that I have an LG C2 attached locally that has zero HDR issues. I haven't tried streaming with that as the display to see if it solves the issue (to confirm that the virtual display is the problem, not Moonlight/Sunshine), so I will give that a shot.
Mind sharing the script, please? I need it as well: running Hyper-V, I hit a Sunshine/Moonlight bug that disconnects me on the first connection attempt. Thank you!
I use a tweaked version of what is described here:
https://www.reddit.com/r/MoonlightStreaming/comments/12fxk2v/stream_4k_without_any_screen/
EnableForStream.bat:
:: Enable the virtual display at 4K/60 and make it the primary monitor.
start /wait c:\multimonitortool\multimonitortool.exe /SetMonitors "Name=\\.\DISPLAY5 BitsPerPixel=32 Width=3840 Height=2160 DisplayFlags=0 DisplayFrequency=60 DisplayOrientation=0 PositionX=0 PositionY=0"
start /wait c:\multimonitortool\multimonitortool.exe /SetPrimary \\.\DISPLAY5
DisableForStream.bat:
:: Hand primary back to the physical monitor, then disable the virtual display.
start /wait c:\multimonitortool\multimonitortool.exe /SetPrimary \\.\DISPLAY3
start /wait c:\multimonitortool\multimonitortool.exe /disable \\.\DISPLAY5
I haven't used Moonlight more than a time or two with this setup, and I previously had issues with a ready-made script that did not revert properly, but so far so good with this one.
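For anyone who would rather not depend on multimonitortool, the same primary-display switch can be made directly against the Win32 display API. Below is a minimal sketch of my own (not part of this driver or the scripts above): it moves the named display device to (0,0) and applies the change with CDS_SET_PRIMARY. The device name \\.\DISPLAY5 is just the example from the script above and will differ per machine; with only the virtual display active, as in the Sunshine setups here, repositioning the other monitors isn't needed.
set_primary.cpp (build with: cl set_primary.cpp user32.lib):
// set_primary.cpp - make a given display device the primary monitor.
// Sketch only: assumes the target display is already enabled and that other
// monitors can stay where they are (fine when the virtual display is the
// only active one).
#include <windows.h>
#include <cstdio>

int main(int argc, char** argv)
{
    // Device name in the same form the script above uses, e.g. "\\.\DISPLAY5".
    const char* device = (argc > 1) ? argv[1] : "\\\\.\\DISPLAY5";

    DEVMODEA dm = {};
    dm.dmSize = sizeof(dm);
    if (!EnumDisplaySettingsA(device, ENUM_CURRENT_SETTINGS, &dm)) {
        std::fprintf(stderr, "EnumDisplaySettings failed for %s\n", device);
        return 1;
    }

    // The primary monitor is, by definition, the one positioned at (0,0).
    dm.dmPosition.x = 0;
    dm.dmPosition.y = 0;
    dm.dmFields |= DM_POSITION;

    LONG rc = ChangeDisplaySettingsExA(device, &dm, nullptr,
                                       CDS_SET_PRIMARY | CDS_UPDATEREGISTRY,
                                       nullptr);
    if (rc != DISP_CHANGE_SUCCESSFUL) {
        std::fprintf(stderr, "ChangeDisplaySettingsEx returned %ld\n", rc);
        return 1;
    }
    std::printf("%s is now the primary display\n", device);
    return 0;
}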
Two things. First, try physically unplugging your TV from your GPU. I had an issue where, even with my LG C8 disabled in Windows, Moonlight would still latch on to it and break HDR.
Also, are you sure you have HDR enabled in Sunshine as well? I believe the setting is in the advanced area.
My 2 cents: as far as I've observed, HDR capability is detected by default in Sunshine, so you shouldn't need to touch the Sunshine configuration (at least on the latest version). However, you must enable experimental HDR in the Moonlight settings, and enable HDR on both the server and the client.
Off-topic: by the way, @Porkchopp, I set the enable .bat script in Sunshine and it seems to help with the initial connection. I still get the random firewall bug, especially if I disconnect Moonlight, but it seems to work on the first attempt right after my Hyper-V VM boots. Thank you!
So I'm in the same boat.
My problem: the game Mortal Kombat 11 does not let me select HDR on its settings page.
My config:
Client: Win11 23H2 with HDR enabled, connected to a Samsung Q85R TV with HDR on; graphics card: NVIDIA GT 1030; CPU: Ryzen 5 5600G
Host: Win11 23H2 with HDR enabled, connected to this virtual display driver; graphics card: RTX 4070; CPU: 12600KF
Sunshine: default values
Moonlight: experimental HDR enabled and showing
The game runs with HDR on when installed locally on the client machine. I still need to test whether it runs with HDR on on the host PC, but I think it will.
I made it primary, but it didn't help. The game where I don't see it is RDR2.
Hi, I'm having the same problem with Doom Eternal. HDR works correctly on the Windows desktop, and some games work too (Devil May Cry, Death Stranding). Auto HDR also works, but some titles like Doom Eternal say the display doesn't support HDR when enabling it. The virtual display is the only one enabled and is set to primary.
Since I don't have any of those games, or a rig to test them on, I can only speculate. But the "might work depending on the game" pattern gives me the idea that those games may require a certified HDR connection, which this driver doesn't have at the moment and, I'm guessing, won't have by default in the future. A long-term goal, though, is to be able to import the EDID from a real display and use it for the base configuration, hopefully without recompilation.
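For context on what the EDID import would buy: a display advertises HDR capability through the CTA-861 extension block of its EDID, specifically the HDR Static Metadata data block (extended tag 6), which is part of what games gating on a "real" HDR display would see. As an illustration only (this is not this driver's code, and the input file name is a hypothetical raw EDID dump), here's a sketch that checks a dump for that block:
edid_hdr_check.cpp:
// edid_hdr_check.cpp - look for the CTA-861 "HDR Static Metadata" data block
// (extended tag 6) in a raw EDID dump. Illustration only; the file name and
// single-extension assumption are mine, not part of this project.
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <vector>

int main(int argc, char** argv)
{
    const char* path = (argc > 1) ? argv[1] : "edid.bin"; // hypothetical dump
    std::ifstream f(path, std::ios::binary);
    std::vector<uint8_t> edid((std::istreambuf_iterator<char>(f)),
                              std::istreambuf_iterator<char>());

    // Base EDID block is 128 bytes; byte 126 counts extension blocks.
    if (edid.size() < 256 || edid[126] == 0) {
        std::puts("No extension block: no HDR metadata possible.");
        return 0;
    }

    const uint8_t* ext = edid.data() + 128;
    if (ext[0] != 0x02) {                 // 0x02 = CTA-861 extension tag
        std::puts("First extension is not CTA-861.");
        return 0;
    }

    // The data block collection runs from byte 4 up to the DTD offset (byte 2).
    uint8_t dtd_offset = ext[2];
    for (size_t i = 4; i + 1 < dtd_offset && i < 127; ) {
        uint8_t tag = ext[i] >> 5;        // upper 3 bits: block tag
        uint8_t len = ext[i] & 0x1F;      // lower 5 bits: payload length
        if (tag == 7 && len >= 1 && ext[i + 1] == 6) { // extended tag 6 = HDR
            std::puts("HDR Static Metadata block found: display advertises HDR.");
            return 0;
        }
        i += 1 + len;
    }
    std::puts("No HDR Static Metadata block found.");
    return 0;
}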
@zjoasan thank you for your response. At least it's good to know that we're currently doing everything correctly.
For RDR2, try setting the game's graphics API to DirectX.
I'm having the same problem here, HDR is enabled in Windows / Moonlight, and it works great in (for example) Hogwarts Legacy, Ori & The Will of the Wisps. But in Tetris Effect: Connected, or Assassin's Creed Origins, the option is greyed out. In all cases, the VDD monitor is the only active monitor (Sunshine disconnects my main monitor), and the game is set to full screen. Any ideas on why it works in some games and not others? Thanks!
Yes! Some games make calls to applications such as the NVIDIA Control Panel to control the HDR configuration, or even to check whether HDR exists at all. Since the virtual display driver doesn't occupy a physical port on the GPU, it cannot provide HDR status to those games. The games would need to be converted to use DirectX HDR, and even then there's no guarantee it will work!
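To illustrate the DirectX path: a DXGI-based game can ask Windows directly whether an output is currently an HDR10 target, instead of going through vendor tools like the NVIDIA Control Panel. A minimal sketch of that query, assuming Windows 10 1803+ for IDXGIOutput6 (again, my illustration, not this project's code):
hdr_probe.cpp (build with: cl hdr_probe.cpp dxgi.lib):
// hdr_probe.cpp - enumerate outputs via DXGI and report which ones Windows
// currently exposes as HDR10 (ST.2084 transfer function, BT.2020 primaries).
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        ComPtr<IDXGIOutput> output;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            ComPtr<IDXGIOutput6> output6;
            if (FAILED(output.As(&output6)))
                continue; // IDXGIOutput6 needs Windows 10 1803+

            DXGI_OUTPUT_DESC1 desc;
            if (FAILED(output6->GetDesc1(&desc)))
                continue;

            // HDR is on for this output when Windows advertises the HDR10
            // color space on it.
            bool hdr = desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
            std::wprintf(L"%s: %s (max luminance %.0f nits)\n",
                         desc.DeviceName, hdr ? L"HDR10" : L"SDR",
                         desc.MaxLuminance);
        }
    }
    return 0;
}
Titles that render HDR through DXGI then request that same HDR10 color space on their swap chain via IDXGISwapChain3::SetColorSpace1, which is the "DirectX HDR" route mentioned above.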
Ah okay. So there's nothing I can do myself then, right? Thanks for the explanation
I mean, not really, other than trying the things mentioned above. But as of right now, that's about it.
HDR is working perfectly on the desktop, but for some reason no game I launch sees it. I've tried two different clients with the same result. Weirdly, Windows offers Auto HDR if the game is known to support it, and that works perfectly well.