Orgak opened this issue 5 months ago
> DXVK always opens your games on the primary monitor. And I'm pretty sure native D3D11 drivers do the same thing.
With native drivers, this problem does not occur. The game opens on the monitor that is marked as primary and then switches to the secondary one. The only way I can play the game is to unplug the cable from one of the monitors.
It's funny, because when I play these games on Manjaro the problem does not exist.
Sadly running into this myself now. It seems that DXVK will ignore the actual primary monitor in favor of whatever monitor happens to end up at index 1. On my system, this is completely random - monitor 1 could even be the VR goggles, since they tend to be the first to respond. While I love the performance gain, I'm not a particular fan of having to turn my head 90° sideways to play games on my vertical side monitor.
> DXVK always opens your games on the primary monitor. And I'm pretty sure native D3D11 drivers do the same thing.
I've spent about half a day looking into the code of DXVK, but I couldn't find a hint of this behavior. There's no mapping between IDXGI::EnumAdapters and EnumDisplayMonitors at all. The latter would be required to even know if a monitor is set as primary in the DWM settings. For D3D9ex and later apps, this file would be the place to look for this, but I can't find either of these calls.
wsi::enumMonitors calls EnumDisplayMonitors on win32 platforms.
If that function returns monitors in a random order, it's not clear what we need to do to sort them correctly. I guess we could add a hundred lines of boilerplate to figure out which monitor intersects (0,0), but it's not clear whether that's supposed to be reliable, and it would still leave the remaining monitors in a random order. What's worse, this could break games that rely on specific relationships between the DXGI outputs for a given monitor and displays enumerated by different APIs.
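For illustration only, here is a rough sketch of what such a reordering could look like (this is not DXVK's actual code, and the helper name is made up); it uses GetMonitorInfo's primary flag rather than the (0,0) intersection test:

#include <algorithm>
#include <vector>
#include <Windows.h>

// Hypothetical helper, not DXVK code: enumerate monitors, then move the primary
// one to the front while keeping the others in their original enumeration order.
std::vector<HMONITOR> enumMonitorsPrimaryFirst() {
    std::vector<HMONITOR> monitors;
    EnumDisplayMonitors(nullptr, nullptr,
        [](HMONITOR hMon, HDC, LPRECT, LPARAM param) -> BOOL {
            reinterpret_cast<std::vector<HMONITOR>*>(param)->push_back(hMon);
            return TRUE;
        },
        reinterpret_cast<LPARAM>(&monitors));

    std::stable_partition(monitors.begin(), monitors.end(), [](HMONITOR hMon) {
        MONITORINFO info = { sizeof(MONITORINFO) };
        return GetMonitorInfoW(hMon, &info) && (info.dwFlags & MONITORINFOF_PRIMARY);
    });
    return monitors;
}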
There's also a known bug on some Windows setups where DXVK doesn't list all (or sometimes, any) display modes for a given monitor (which I cannot reproduce on any of my setups), and it's not clear why, but apparently EnumDisplaySettings doesn't really enumerate display settings either. We can't really change much here without testing literally hundreds of games on all sorts of different setups, because matching specific details across several unrelated APIs is extremely difficult.
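For reference, GDI-side mode enumeration is normally done by indexing EnumDisplaySettings until it fails; a rough sketch (the device name parameter would be whatever szDevice string GetMonitorInfo returned):

#include <cstdio>
#include <Windows.h>

// Rough sketch: list every display mode reported for a GDI device name
// (e.g. the szDevice string returned by GetMonitorInfo).
void listModes(const wchar_t* deviceName) {
    DEVMODEW mode = {};
    mode.dmSize = sizeof(DEVMODEW);
    for (DWORD i = 0; EnumDisplaySettingsW(deviceName, i, &mode); i++) {
        wprintf(L"%ux%u @ %u Hz\n",
                mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency);
    }
}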
> If that function returns monitors in a random order, it's not clear what we need to do to sort them correctly.
You can either use the boilerplate code, or just call GetMonitorInfo. The MONITORINFO structure has a single flag field that tells you whether the monitor is the primary or not. There's also this alternative, written by Microsoft, for systems prior to Windows 2000:
HMONITOR GetPrimaryMonitorHandle()
{
    const POINT ptZero = { 0, 0 };
    return MonitorFromPoint(ptZero, MONITOR_DEFAULTTOPRIMARY);
}
The return value is an HMONITOR, which will always be the primary monitor.
Edit: As far as monitor order goes, IDXGIAdapter::EnumOutputs always lists the primary monitor first, then goes by id as normal. On some drivers, the primary monitor will be listed twice, once with a generic "Primary Monitor" name, and once with its actual name.
Apparently there's a shorter form of this called EnumDisplayDevices, though it lacks the HMONITOR reference.
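As a purely illustrative sketch of that shorter form: the primary device shows up through a state flag, though as noted there is no HMONITOR to go with it.

#include <cstdio>
#include <Windows.h>

// Illustrative only: walk the GDI display devices and report which one is primary.
void listDisplayDevices() {
    DISPLAY_DEVICEW dd = {};
    dd.cb = sizeof(DISPLAY_DEVICEW);
    for (DWORD i = 0; EnumDisplayDevicesW(nullptr, i, &dd, 0); i++) {
        bool primary = (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) != 0;
        wprintf(L"%s%s\n", dd.DeviceName, primary ? L" (primary)" : L"");
    }
}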
Again, the main concern here is that we really need very strong guarantees that changing the default behaviour here isn't going to break a whole bunch of games that rely on DXGI being consistent with other Windows APIs. It's not something we can "just fix" even if the fix is like 10 lines of code.
If Wine always lists the primary monitor first anyway then that would be good enough, but I'm not sure if it does, and I don't really have a way to do any sort of testing on this either.
I made a small tool to compare the output between native DXGI and DXGI-on-DXVK. Both were run in a native Windows environment, which is the environment that should be matched anyway. First the code:
#include <cinttypes>
#include <exception>
#include <iostream>
#include <memory>
#include <stdexcept>

#include <Windows.h>
#include <dxgi.h>
#include <wrl.h>

using namespace Microsoft::WRL;

int32_t main(int32_t argc, const char* argv[])
{
    DISPLAY_DEVICEW ddw;
    ddw.cb = sizeof(DISPLAY_DEVICEW);

    { // EnumDisplayDevices
        std::cout << "EnumDisplayDevices:" << std::endl;
        for (size_t idx = 0; EnumDisplayDevicesW(nullptr, idx, &ddw, 0) != 0; idx++) {
            printf_s("%3llu: %ls, %ls, %ls, %ls\n", idx, ddw.DeviceName, ddw.DeviceString, ddw.DeviceID, ddw.DeviceKey);
        }
        std::cout << std::endl;
    }

    { // EnumDisplayMonitors
        std::cout << "EnumDisplayMonitors:" << std::endl;
        struct monitorData {
            size_t idx = 0;
            MONITORINFOEXW info;
        } yv;
        yv.idx = 0;
        yv.info.cbSize = sizeof(MONITORINFOEXW);
        EnumDisplayMonitors(
            NULL, NULL,
            [](HMONITOR hMon, HDC hDC, LPRECT rect, LPARAM ptr) {
                auto yv = reinterpret_cast<monitorData*>(ptr);
                if (GetMonitorInfoW(hMon, &(yv->info)) != 0) {
                    printf("%3llu: %ls [%ld, %ld, %ld, %ld] %s\n", yv->idx, yv->info.szDevice, yv->info.rcMonitor.left, yv->info.rcMonitor.top, yv->info.rcMonitor.right, yv->info.rcMonitor.bottom, yv->info.dwFlags == MONITORINFOF_PRIMARY ? "Primary" : "");
                }
                yv->idx++;
                return TRUE;
            },
            reinterpret_cast<LPARAM>(&yv));
        std::cout << std::endl;
    }

    try {
        std::cout << "IDXGI EnumAdapter/EnumOutputs:" << std::endl;
        auto hMod = LoadLibraryW(L"dxgi.dll");
        decltype(CreateDXGIFactory)* createDXGIFactory = reinterpret_cast<decltype(createDXGIFactory)>(GetProcAddress(hMod, "CreateDXGIFactory"));

        ComPtr<IDXGIFactory> fac;
        if (createDXGIFactory(__uuidof(decltype(fac)), &fac) != S_OK) {
            throw nullptr;
        }

        ComPtr<IDXGIAdapter> adp;
        for (size_t adx = 0; fac->EnumAdapters(adx, &adp) == S_OK; adx++) {
            DXGI_ADAPTER_DESC desc;
            adp->GetDesc(&desc);
            printf("%3llu: %ls\n", adx, desc.Description);

            ComPtr<IDXGIOutput> out;
            for (size_t odx = 0; adp->EnumOutputs(odx, &out) == S_OK; odx++) {
                DXGI_OUTPUT_DESC desc2;
                out->GetDesc(&desc2);
                MONITORINFOEXW info;
                info.cbSize = sizeof(decltype(info));
                GetMonitorInfoW(desc2.Monitor, &info);
                printf(" %3llu: %ls [%ld, %ld, %ld, %ld] %s\n", odx, desc2.DeviceName, desc2.DesktopCoordinates.left, desc2.DesktopCoordinates.top, desc2.DesktopCoordinates.right, desc2.DesktopCoordinates.bottom, info.dwFlags == MONITORINFOF_PRIMARY ? "Primary" : "");
            }
        }
    } catch (...) {
    }

    return 0;
}
And now the results, first being native DXGI on Windows:
The behavior of DXVK is very clearly wrong, at least since Windows 8. Prior to Windows 8, the behavior varied by driver and GPU, but the primary monitor was usually listed first. DXGI was never consistent with other Windows APIs - it was a driver-dependent mess right up until Windows 8.
I'm not disputing that our behaviour is technically wrong, I just don't feel comfortable changing it because it's extremely difficult to test (I can't even replicate the actual issue on any of my setups). Again, we need a very strong guarantee that changing this cannot regress Proton in weird ways, and we don't really seem to have that right now, nor do I want to introduce Windows-only code paths.
Last time we touched monitor stuff it took something like one and a half years to get enough testing that we were comfortable merging it, and even then we had issues with it for a while.
It is extremely unlikely that this will introduce anything that breaks, as you could never have a 1 to 1 mapping between any of the APIs used to enumerate monitors. Any application that targets Direct3D 10 or later will continue working just fine. The only place where you could even break anything is if you somehow ran a Direct3D 9 (normal 9, not 9ex) application on top of DXGI and expected it to just work. There's no known application that does this, since Windows itself doesn't even support this.
There are games that literally break when the graphics driver DLL doesn't start with "ati" or "nv"; I'm not trusting this blindly.
Okay, I need you to explain. What could possibly happen by properly emulating the API that an app was designed for? We're not talking about renaming files; we're talking about apps that are designed and built for Windows with DirectX. A single IDXGIOutput entry being moved to the front of a list, where it belongs and always has been on Windows.
Games don't just use DXGI to do monitor stuff, games use 30 different APIs at the same time and expect them to match.
Even if it's not documented.
Even if it's technically wrong on Windows.
I don't want to break shit on Wine for no reason, that's all. We can't touch monitor / windowing stuff without basically testing the entire Steam catalogue against it because literally any time we've done so in the past, it has been a disaster.
I'm still confused how this would even remotely break Wine at all. If Wine lists the monitors with the primary first by default, as stated in the first reply, then moving the primary monitor to the front of the list will not change behavior at all. In fact, how would it? Monitor 0 would still be monitor 0.
And no matter how many APIs you load in, no matter how different they are, since Windows 7's GPU-accelerated DWM you are required to render through DXGI. This rule is the same for any API, Direct3D, Direct2D, DirectDraw, OpenGL, Vulkan, GDI, GDI+, you name it - it has to go through DXGI. Many of these do it implicitly, so that you don't have to do it yourself. Sometimes this even happens in the driver, which is the case for OpenGL and Vulkan, and was the case for GDI and GDI+ until Windows 10.
Not to mention that you simply can't match EnumDisplayMonitors, EnumDisplayDevices, and DXGI EnumAdapter/EnumOutputs at all. The best you can do is match HMONITORs and HDCs, or hope that the strings between the enumerations are the same (e.g. GDI names, or registry IDs). Like, I get the fear of breaking something, but keeping something broken because something somewhere might not expect DXGI-on-Wine to behave like DXGI-on-Windows feels like insanity to me.
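For what it's worth, here is a sketch of the kind of matching being described, comparing either the HMONITOR or the GDI device name that both APIs expose; it assumes the driver actually fills in both fields, which is not guaranteed:

#include <cwchar>
#include <Windows.h>
#include <dxgi.h>

// Sketch only: check whether a DXGI output and a GDI monitor refer to the same
// display, either by HMONITOR identity or by comparing the GDI device name.
bool sameDisplay(IDXGIOutput* output, HMONITOR hMon) {
    DXGI_OUTPUT_DESC desc = {};
    if (FAILED(output->GetDesc(&desc)))
        return false;
    if (desc.Monitor == hMon)
        return true;

    MONITORINFOEXW info = {};
    info.cbSize = sizeof(MONITORINFOEXW);
    return GetMonitorInfoW(hMon, &info) && wcscmp(desc.DeviceName, info.szDevice) == 0;
}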
> If Wine lists the monitors with the primary first by default, as stated in the first reply
Does it though?
> Not to mention that you simply can't match EnumDisplayMonitors, EnumDisplayDevices, and DXGI EnumAdapter/EnumOutputs at all.
This hasn't ever stopped anybody as long as it happens to work on 90% of setups. Not to mention that D3D9 exists (and D3D9 games using DXGI exist, hello GTA 4), QueryDisplayConfig exists, NVAPI / AMDAGS exist, plain old Registry exists, and probably at least 10 other ways of retrieving some sort of display info. We're also in a position where we have games silently working around Proton issues so even fixing a bug can be a bad thing. And I cannot test any of this properly because none of my setups have this issue to begin with.
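As one example of yet another such path, here is a sketch of pulling the GDI device names out of QueryDisplayConfig (error handling trimmed); a game could cross-reference these names against what DXGI or EnumDisplayMonitors reports:

#include <cstdio>
#include <vector>
#include <Windows.h>

// Sketch: enumerate the active display paths and print each source's GDI device name.
void listDisplayPaths() {
    UINT32 pathCount = 0, modeCount = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &pathCount, &modeCount) != ERROR_SUCCESS)
        return;

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                           &modeCount, modes.data(), nullptr) != ERROR_SUCCESS)
        return;

    for (UINT32 i = 0; i < pathCount; i++) {
        DISPLAYCONFIG_SOURCE_DEVICE_NAME name = {};
        name.header.type      = DISPLAYCONFIG_DEVICE_INFO_GET_SOURCE_NAME;
        name.header.size      = sizeof(name);
        name.header.adapterId = paths[i].sourceInfo.adapterId;
        name.header.id        = paths[i].sourceInfo.id;
        if (DisplayConfigGetDeviceInfo(&name.header) == ERROR_SUCCESS)
            wprintf(L"%s\n", name.viewGdiDeviceName);
    }
}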
FWIW, there are other examples where we're known to be broken in some cases and leaving it broken just seems like the better idea, e.g. our DXGI does not intercept window messages. This is a problem for a few games, but the alternative would be dealing with insane bugs for literal years because we keep getting some tiny little detail wrong somewhere, like we do in the D3D9 implementation where we unfortunately don't have a choice.
> Does it though?
I wouldn't know. If it is a proper translation layer, then it will mimic Windows to the dot, including listing monitors in the same order Windows does - which would reproduce this issue as well. Hell, in many cases Wine is a better, more accurate emulation.
> And I cannot test any of this properly because none of my setups have this issue to begin with.
It's incredibly simple to reproduce, as evidenced by the provided data. All you need is two monitors; then set the monitor with the second index as the primary one.
> This hasn't ever stopped anybody as long as it happens to work on 90% of setups. Not to mention that D3D9 exists (and D3D9 games using DXGI exist, hello GTA 4), QueryDisplayConfig exists, NVAPI / AMDAGS exist, plain old Registry exists, and probably at least 10 other ways of retrieving some sort of display info.
As you may have already realized, none of those approaches will work.
In the GDI days, you would need an HDC that matches the monitor of choice, and in the DXGI days, you need the matching IDXGIOutput. You don't control the HDC path anyway, but you do control the IDXGIOutput path. As already explained, since the introduction of a GPU-accelerated DWM there is no way around DXGI for rendering 2D or 3D to the screen efficiently. None. Your fears aren't backed up by reality.
If an app bypasses DXGI entirely for rendering, i.e. uses the old GDI(+) way, then that is no longer a problem for DXVK. Considering that the tagline is "Vulkan-based implementation of D3D8, 9, 10 and 11 for Linux / Wine", which is clearly not the case right now, your goal should be to mimic the DirectX behavior and not what some apps wish to experience.
There are no other ways to render to the screen in Windows. None. Nada. You have two options. That's it.
Edit: Not to mention that since Windows 8, GDI is done via DXGI as well.
I've tried everything to play my games on the second monitor (a TV), but nothing works: the games start on the second monitor and then, about a second later, switch to the other monitor, the main one.
OS: Windows 10
GPU: AMD Radeon RX 580 2048SP
CPU: FX-6300
Monitor: 2560 x 1080 @ 75 Hz
TV (second monitor): 1920 x 1080 @ 60 Hz