raff-run opened 1 year ago
As far as I know, the driver versions on the host and in the VM need to match for it to work.
Well, yes, they are since I followed the walkthrough.
Sorry, I was just pointing it out since you mentioned updating in the VM manually but never mentioned the host.
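As a quick sanity check on that advice, a plain version-string comparison like the sketch below can rule out a silent mismatch (helper names are my own; on Windows the actual version strings would come from something like `nvidia-smi` or Device Manager on both host and guest):

```python
def parse_driver_version(version: str) -> tuple:
    """Turn a dotted driver version string (e.g. '457.09') into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def drivers_match(host_version: str, guest_version: str) -> bool:
    """True only if host and guest report the exact same driver version."""
    return parse_driver_version(host_version) == parse_driver_version(guest_version)

# A mismatch like the second case is what the advice above says to avoid.
print(drivers_match("531.41", "531.41"))  # True
print(drivers_match("531.41", "457.09"))  # False
```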
I used 3DMark's Fire Strike to test DX11 performance and found that the VM's physics scores were significantly lower than the host's. Perhaps this explains the low frame rates in the few DX11 games I play.
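To put a number on that gap, the relative loss can be computed from the two benchmark scores (a minimal sketch; the score values below are placeholders for illustration, not results posted in this thread):

```python
def vm_overhead_pct(host_score: float, vm_score: float) -> float:
    """Percentage of performance lost in the VM relative to the bare-metal host."""
    return 100.0 * (host_score - vm_score) / host_score

# Placeholder scores purely for illustration:
print(round(vm_overhead_pct(20000, 15000), 1))  # 25.0
```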
I'm in the same situation. My machine is an i7-8700 with a 1080 Ti. I tried rolling back the graphics driver to 457.09, but the problem persists. I also found these two posts on the Larian forums, but trying their suggestions didn't help either: (https://forums.larian.com/ubbthreads.php?ubb=showflat&Number=652940#Post652940 https://forums.larian.com/ubbthreads.php?ubb=showflat&Number=675396)
Having the same issue
I created a Windows 11 Pro VM with Hyper-V using this repo (and also tried with this fork). My host is a Ryzen 9 7900X with 64 GB DDR5-6000 and an Nvidia RTX 3070, running Windows 11 Pro. I gave the VM 50% of the RAM, CPU, and GPU.
When I play a DirectX 12 game, the performance is as expected. However, when I try to play a DirectX 11 game, the performance is terrible, and it seems like the GPU isn't being utilized for some reason.
I discovered this because God of War ran just fine in the VM (~60 FPS, 1080p high), while a really lightweight game (Season) ran terribly (~20 FPS, 1080p low). The biggest difference I could narrow it down to is that GoW uses DirectX 12 and Season uses DirectX 11.
To illustrate this, here are some screenshots from Fortnite, which helpfully lets you switch between DirectX 12 and 11.
DirectX 12: High 1080p, 120fps cap: https://i.imgur.com/oOsJF5x.jpg
- Game runs fine, consistently ~120 FPS
DirectX 11: High 1080p, 120fps cap: https://i.imgur.com/di5M1li.png
- Gave up after 5 minutes at 0-1 FPS just trying to render the menu; couldn't even get into the game.
I've tried googling this issue and haven't been able to find anything. Has anyone had this issue, and if so, do you have any idea how to solve it? Thanks!
Having the exact same issue.

Edit 1: I made another VM with Win 11. It seems to be an issue directly with Fortnite; I tested Destiny, and it works just fine.

Edit 2: I managed to get Fortnite working with DX12, probably some compatibility issue with TSR (I turned everything down to low or off), and it now holds a stable 60+ FPS. The issue seems to be that, depending on the game, things work better on DX12 or DX11.
My rig has a Ryzen 5 5600 and an RTX 2060, and after following the entire walkthrough, the Heaven benchmark works just fine, holding a stable 60 FPS on its Extreme preset.
However, Divinity: Original Sin 2 (Definitive Edition) runs at 10 FPS, while the Classic version runs at full frame rate with no problems. Vagante, a game with extremely simple graphics, runs at 1-3 FPS with lots of artifacts. Caveblazers, on the other hand, works fine. Why?
Searching the comments on Craft Computing's video (https://www.youtube.com/watch?v=XLLcc29EZ_8), I found one saying that downgrading their driver version fixed the exact game I'm trying to run:
However, doing the same didn't help at all, which makes me think you need a particular GPU/driver version combo for it to work (I updated the VM's driver to match, too):
So I thought I'd make this issue a place for people to chime in and post which combination worked for them in the end. I'm still looking for mine...
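To keep the chime-ins comparable, the reports could be collected in a simple structure like this sketch (field names are my own; the entries reflect only what's been posted in this thread so far, and `None` means the guest driver version wasn't stated):

```python
# Each report: GPU, guest driver version (if stated), and whether DX11 worked.
reports = [
    {"gpu": "GTX 1080 Ti", "driver": "457.09", "dx11_ok": False},  # rollback didn't help
    {"gpu": "RTX 3070",    "driver": None,     "dx11_ok": False},  # Fortnite DX11 unplayable
    {"gpu": "RTX 2060",    "driver": None,     "dx11_ok": False},  # D:OS2 DE / Vagante broken
]

def working_combos(reports):
    """Filter down to GPU/driver pairs where DX11 reportedly worked."""
    return [(r["gpu"], r["driver"]) for r in reports if r["dx11_ok"]]

print(working_combos(reports))  # [] -- no working combo reported yet
```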