stride3d / stride

Stride Game Engine (formerly Xenko)
https://stride3d.net
MIT License

discrete GPU is skipped on laptops (with optimus dual-graphics?) #707

Open jeske opened 4 years ago

jeske commented 4 years ago

Release Type: Release Version: 4.0.0.1 beta2 0928 Platform(s): Windows

Describe the bug

On some laptops, Stride picks the slow integrated GPU even when there is a fast discrete GPU.

I believe this happens because on laptops with switchable graphics, only the "default" GPU has a display attached, and that is normally the integrated GPU. Stride checks for an attached display and skips any GPU without one in the code linked below. Yet other game engines (Cryengine, in the user report) still come up on the discrete GPU.

https://github.com/stride3d/stride/blob/273dfddd462fd3746569f833e1493700c070b14d/sources/engine/Stride.Games/GamePlatform.cs#L264-L269

I'll have to do more testing, but I think Stride should probably use some method that returns the system's default 3D device instead of its own picking algorithm based on the attached display. On D3D I believe this means calling CreateDevice() with the D3DADAPTER_DEFAULT parameter. It could do this as part of the factory device enumeration and then set some kind of default-device flag in the enumerated device list.
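
To make the idea concrete, here is a minimal, hypothetical C++/DXGI sketch (not Stride's actual enumeration code): adapter 0 returned by IDXGIFactory1::EnumAdapters1 is the system default adapter (the DXGI counterpart of D3DADAPTER_DEFAULT), so the enumeration could simply record which entry is the default instead of discarding adapters that have no output attached.

// Hypothetical sketch, not Stride code. Link with dxgi.lib.
#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cwchar>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // Adapter 0 is the default adapter; a device enumerator could keep
        // this as a flag on the entry instead of skipping display-less GPUs.
        wprintf(L"adapter %u: %s%s\n", i, desc.Description,
                i == 0 ? L" (default)" : L"");
    }
    return 0;
}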

To Reproduce

Steps to reproduce the behavior:

  1. Start Stride on a laptop with Optimus-style dual graphics, with the integrated GPU as the default
  2. WITNESS: Stride starts on the integrated GPU
  3. Start some other heavy 3D application or game (Cryengine, Unreal, Blender)
  4. WITNESS: the other 3D app starts on the fast discrete GPU

Expected behavior

Expected Stride to use the discrete GPU, just like other games do.

joaovsq commented 4 years ago

We could do something like this (C++):

#ifdef _WIN32
#include <windows.h>  // for DWORD

// Exported symbols that ask the NVIDIA and AMD drivers to prefer the
// discrete GPU; per the vendor docs they must be exported from the
// application executable itself.
extern "C" {
  // NVIDIA Optimus
  __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;

  // AMD PowerXpress
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
#endif

References:

Nvidia https://docs.nvidia.com/gameworks/content/technologies/desktop/optimus.htm

AMD https://community.amd.com/thread/169965

(EDIT) By the way, as an example, this is what Cryengine does.

tebjan commented 4 years ago

@joaovsq thanks for the hint. I've searched around a bit, and this seems to be very difficult to do from a managed executable; you have to do some IL hacks after compilation. I found this:

http://lucasmagder.com/blog/2014/05/exporting-data-symbols-in-c-for-nvidia-optimus/

and this:

https://stackoverflow.com/questions/17270429/forcing-hardware-accelerated-rendering

It might be possible, since Stride already does some assembly processing.

But selecting the correct device in FindBestDevice might also just work...
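
For illustration, here is a hypothetical C++/DXGI sketch of that idea (FindBestDevice itself is C# and goes through Stride's own graphics layer, so this only shows the shape of the logic): instead of rejecting adapters that have no display attached, skip software adapters and prefer the hardware adapter with the most dedicated video memory.

// Hypothetical sketch, not Stride's FindBestDevice. Link with dxgi.lib.
#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Pick the hardware adapter with the most dedicated video memory,
// regardless of whether it has a display output attached.
ComPtr<IDXGIAdapter1> PickBestAdapter(IDXGIFactory1* factory)
{
    ComPtr<IDXGIAdapter1> best;
    SIZE_T bestMemory = 0;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // ignore WARP / software adapters

        if (desc.DedicatedVideoMemory > bestMemory)
        {
            bestMemory = desc.DedicatedVideoMemory;
            best = adapter;
        }
    }
    return best;
}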

jeske commented 4 years ago

I don't see anything in there for exporting constants. Here are two references.

tebjan commented 3 years ago

I found the right setting; it should be set in the registry by the installer/launcher. I just added it to the vvvv installer and it works like a charm. It even allows the user to change the setting later and doesn't force it:

https://www.trishtech.com/2018/12/how-to-set-graphics-performance-preference-for-apps-in-windows-10/
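
For reference, a minimal hypothetical sketch (not the actual vvvv installer code) of an installer writing that preference, assuming the per-app setting is stored as a string value named after the full executable path under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, with "GpuPreference=2;" meaning high performance:

// Hypothetical sketch of setting the Windows 10 per-app GPU preference.
// Link with Advapi32.lib.
#include <windows.h>
#include <cwchar>

int main()
{
    // Hypothetical install path; a real installer would use the actual location.
    const wchar_t* exePath = L"C:\\Games\\MyStrideGame\\MyStrideGame.exe";
    // 0 = let Windows decide, 1 = power saving, 2 = high performance
    const wchar_t* pref = L"GpuPreference=2;";

    LONG rc = RegSetKeyValueW(HKEY_CURRENT_USER,
                              L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
                              exePath, REG_SZ, pref,
                              (DWORD)((wcslen(pref) + 1) * sizeof(wchar_t)));
    return rc == ERROR_SUCCESS ? 0 : 1;
}

Since the user can still change the same value from Windows' Graphics settings page, this approach does not force the GPU choice, which matches the behavior described above.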