Open PixelyIon opened 7 months ago
Since I can't find an easy way to script `ID3D12Device::GetAdapterLuid`, would this be helpful? I haven't looked at the code for assigning a GPU to the virtual driver. PowerShell command:
```
gwmi win32_VideoController | FT Name,AdapterDACType,DeviceID,PNPDeviceID

Name                       AdapterDACType     DeviceID          PNPDeviceID
----                       --------------     --------          -----------
Intel(R) UHD Graphics 630  Internal           VideoController1  PCI\VEN_8086&DEV_3E9B&SUBSYS_1B3E1043&REV_00\3&1158365...
NVIDIA GeForce GTX 1050 Ti Integrated RAMDAC  VideoController2  PCI\VEN_10DE&DEV_1C8C&SUBSYS_1B3E1043&REV_A1\4&1378839...
IddSampleDriver Device     HDR                VideoController3  ROOT\DISPLAY\0000
```
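For what it's worth, WMI doesn't expose adapter LUIDs, but the same LUID that `ID3D12Device::GetAdapterLuid` returns can be read from a tiny standalone DXGI tool, no scripting of the D3D12 device required. A minimal sketch (Windows-only, link against `dxgi.lib`; untested here, but it only uses documented DXGI calls):

```cpp
// Enumerate every GPU on the system and print its name and LUID via DXGI.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        if (SUCCEEDED(adapter->GetDesc1(&desc))) {
            // AdapterLuid is a {HighPart, LowPart} pair; print it as two hex words.
            wprintf(L"%s  LUID: %08lx-%08lx\n", desc.Description,
                    desc.AdapterLuid.HighPart, desc.AdapterLuid.LowPart);
        }
        adapter.Reset();
    }
    return 0;
}
```

This also lists software adapters (e.g. the Microsoft Basic Render Driver), which you'd likely want to skip by checking `desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE`.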
So is there any way to do this with the released driver? I have an RTX 3080 and a T400 and wanted to use the T400 for encoding, so I could keep HAGS enabled on the RTX 3080 and use FSR 3 for frame generation. With HAGS on and high GPU utilization, the stream is choppy at best :P
I'm using this on a laptop with an Intel iGPU as well as an Nvidia dGPU. Windows currently auto-picks my iGPU as the render adapter for the DXGI swapchain, which isn't desirable since my iGPU is far worse at video encoding than my dGPU. From a cursory look, a potential fix would be to let the config specify the desired adapter, either by name or by LUID, and then call `IddCxAdapterSetRenderAdapter` at startup with that GPU's LUID to ensure the right adapter is used. A tiny utility that enumerates the LUIDs of all GPUs present on the system would be the perfect companion to it.
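The driver-side half of that fix could look roughly like the sketch below: once the adapter is initialized and the configured LUID has been resolved, hand it to IddCx. This is a hypothetical helper, not code from this repo; the `PreferredRenderAdapter` field name is from `iddcx.h`, so verify it against your WDK version before relying on it.

```cpp
// Driver-side sketch (assumed to run after IddCxAdapterInitAsync completes):
// tell IddCx which GPU should back the virtual display, instead of letting
// Windows pick one. gpuLuid would come from the config / enumeration utility.
#include <iddcx.h>

void SetPreferredRenderGpu(IDDCX_ADAPTER adapter, LUID gpuLuid) {
    IDARG_IN_ADAPTERSETRENDERADAPTER args = {};
    args.PreferredRenderAdapter = gpuLuid;  // LUID of the GPU to render/encode on
    IddCxAdapterSetRenderAdapter(adapter, &args);
}
```

Note this is a preference, not a hard guarantee: if the LUID is stale (LUIDs change across reboots and driver restarts), the OS may fall back to its default choice, which is another reason to resolve the adapter by name at startup rather than persisting a raw LUID.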