jamesstringerparsec / Easy-GPU-PV

A Project dedicated to making GPU Partitioning on Windows easier!

Passthrough working, but Parsec is not #285

Open phongnguyen-aveksana opened 1 year ago

phongnguyen-aveksana commented 1 year ago

I am on a Windows 11 laptop and have successfully passed through both my integrated graphics (AMD) and my RTX 3050Ti to my Hyper-V VM. However, I can't get Parsec working. I get the 14003 error when connecting through Parsec.

I read about the virtual monitor driver thing and installed it. And when I disable my RTX 3050Ti in device manager, meaning the VM is using integrated graphics, then Parsec connects with no problem. That means the virtual monitor is working.

I figured that the problem is that when I enable both cards, Parsec in my VM is using my RTX 3050Ti for output and not the integrated graphics, which is required for Parsec to function. I tried to force Parsec to use integrated graphics in Windows graphics settings (Power saving option), but apparently that doesn't do anything. I can't install NVIDIA Control Panel on the VM, so that's not an option.

Does anyone know if there's a way to force Parsec to use integrated graphics in Hyper-V, or force Hyper-V to use integrated graphics as default output device when both integrated and dedicated graphics are enabled?

Kodikuu commented 1 year ago

Only pass through the GPU you want it to use.
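
In other words, remove the partition adapter for the integrated GPU from the VM and keep only the one you want Parsec to encode from. As a rough sketch of how that looks with the built-in Hyper-V cmdlets (run in an elevated PowerShell on the host; the VM name "GPU-VM" and the `-InstancePath` value are placeholders, not values from this thread):

```shell
# List the GPU partition adapters currently assigned to the VM.
Get-VMGpuPartitionAdapter -VMName "GPU-VM"

# Remove all assigned partition adapters, then re-add only the GPU
# the VM should actually use.
Remove-VMGpuPartitionAdapter -VMName "GPU-VM"

# Find the real instance path of the GPU you want to keep:
Get-VMHostPartitionableGpu | Select-Object Name

# Re-add just that one GPU (path below is illustrative only).
Add-VMGpuPartitionAdapter -VMName "GPU-VM" -InstancePath "\\?\PCI#VEN_10DE..."
```

Note that `-InstancePath` on `Add-VMGpuPartitionAdapter` is only available on newer Windows builds; on older builds the cmdlet simply assigns the default partitionable GPU, in which case disabling the unwanted GPU on the host before assignment is the workaround.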


phongnguyen-aveksana commented 1 year ago

Update: When I plugged in a real external monitor, Parsec just worked (so it will most likely work with a dummy plug as well). In fact, after I plugged the monitor in, connected Parsec, and then unplugged it, Parsec kept working even after I restarted Parsec and the VM. However, after restarting the host machine, the 14003 error appeared again.

I will probably just buy a dummy plug. It's a shame this can't be done purely in software.

Remondor commented 4 months ago

Have you tried disabling the integrated graphics and adding a software virtual display to the parent PC?