Closed elisee closed 9 years ago
Intel drivers on Windows are pretty bad. Hybrid GPU drivers are even worse.
I've read about the app manifest, but I think that's for Android, rather than Windows. If those vendors have something they can use, they've done a very poor job of informing devs of the situation. But hey, to experiment with how irredeemably bad they are at their jobs, let's give something else a try:
Since they can't seem to detect when a GL context is made, they probably have something to detect specific games that create a GL context and are likely GPU-intensive to play. Ask the user to try renaming CraftStudio to something like doom3.exe or rage.exe, or some other OpenGL app that the driver authors probably thought of when writing the dynamic switcher.
I asked one of those who reported the issue... and it worked. So basically they have a list of graphics-intensive process names somewhere in the driver. That's about as lame as the apps with the /Windows 9*/ regexes that prevented MS from having an OS named Windows 9.
Remember, kids: This is why Linux users get upset when you give them shit about "poor driver support" on their OS. :)
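For the curious, the kind of check being mocked here boils down to a plain substring match; a hypothetical sketch in C of the infamous version check (the function name is made up for illustration):

```c
#include <string.h>

/* Apps that matched "Windows 9" to detect Windows 95/98 would also have
 * matched a hypothetical "Windows 9" -- reportedly one reason Microsoft
 * skipped straight to Windows 10. A driver's process-name whitelist works
 * the same way: match a known string, flip a behavior switch. */
static int is_windows_9x(const char *os_name)
{
    return strstr(os_name, "Windows 9") != NULL;
}
```

`is_windows_9x("Windows 95")` and `is_windows_9x("Windows 98")` match as intended, but so would `is_windows_9x("Windows 9")`, which is the false positive.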
I seem to recall there might be some sort of manifest that we could ship with FNA on Windows to force the discrete GPU to be used instead of the integrated chip. I've searched for info about it but I haven't found any. Do you know if such a thing exists?
This article says that NVIDIA Optimus will look for a globally exported integer variable (NvOptimusEnablement) in the executable: http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf
I don't actually know anything about C# so I can't say how that can be done with FNA though...
Cool. So basically we'd need the C# equivalent of:
extern "C" {
__declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
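For what it's worth, AMD's switchable-graphics driver honors an analogous export, AmdPowerXpressRequestHighPerformance. A sketch covering both vendors, guarded so it still compiles off Windows (the GPU_HINT_EXPORT macro name is just for this sketch):

```cpp
#include <cstdint>

// __declspec(dllexport) is MSVC-specific; compile it away elsewhere so
// the sketch stays portable.
#ifdef _WIN32
#define GPU_HINT_EXPORT __declspec(dllexport)
#else
#define GPU_HINT_EXPORT
#endif

extern "C" {
    // NVIDIA Optimus: a nonzero value requests the discrete GPU
    // (see the Optimus Rendering Policies PDF linked above).
    GPU_HINT_EXPORT uint32_t NvOptimusEnablement = 0x00000001;

    // AMD PowerXpress/Enduro: the analogous export honored by AMD's
    // switchable-graphics driver.
    GPU_HINT_EXPORT int AmdPowerXpressRequestHighPerformance = 1;
}
```

Either way, the driver only reads these out of the .exe itself, which is exactly why a managed host process can't provide them without the tricks discussed below.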
This thread says there's no equivalent to DllExport in C#, but points to a roundabout way of doing it by decompiling to IL, patching, then recompiling. Not very practical.
If a project wanted to use MonoKickstart on Windows they could export it from the kick binary, but trading .NET for Mono just for that is pretty extreme. Maybe if CoreCLR becomes something you can embed like libmono that'll make it feasible, who knows.
Apparently if you go to the NVIDIA Control Panel you can add a CraftStudio entry to the 3D settings that forces it to use the dedicated GPU. I've not tried this myself, but it'd make sense to have that in their overrides menu.
The issue with the dual-GPU machine choosing the integrated GPU instead of the dedicated one also happens with native XNA. I've had a lot of reports from players about it on Windows (even though it doesn't crash, it results in obviously terrible performance, and the users are very unhappy and blame the game).
The only solution/hack I've found (and plan to implement in a future patch) to actually detect the problem is checking this.GraphicsDevice.Adapter and reading its .Description member to detect the GPU's manufacturer. If the card is an Intel one, you can list all available GraphicsAdapter.Adapters and look for a dedicated GPU brand in there. If one is found, you can then show a warning screen asking the player to configure the game to run on the correct GPU in the vendor's control panel software. This is a terrible solution (and not future-proof), but I really can't find any other way to tackle the problem, since we cannot choose the adapter in use.
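The brand check itself is just substring matching on the adapter Description strings; a hypothetical sketch in C (the helper names and brand strings are assumptions here — real Description strings vary by driver):

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical helper: does this adapter Description look like an
 * integrated Intel chip? */
static int is_integrated_intel(const char *description)
{
    return description != NULL && strstr(description, "Intel") != NULL;
}

/* Scan every adapter Description for a dedicated-GPU brand, mirroring
 * the GraphicsAdapter.Adapters walk described above. */
static int has_dedicated_gpu(const char *const descriptions[], size_t count)
{
    for (size_t i = 0; i < count; i++) {
        if (descriptions[i] == NULL)
            continue;
        if (strstr(descriptions[i], "NVIDIA") != NULL ||
            strstr(descriptions[i], "AMD") != NULL ||
            strstr(descriptions[i], "Radeon") != NULL)
            return 1;
    }
    return 0;
}
```

If is_integrated_intel() is true for the active adapter and has_dedicated_gpu() finds another brand in the full list, that's the cue to show the warning screen.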
However, GraphicsAdapter.Description throws NotImplementedException in FNA at the moment, so this is only possible in XNA builds.
The problem with that is that we cannot enumerate graphics devices with OpenGL. There may be some way to enumerate possible devices from WGL/GLX, but I would guess that it'd be an SDLGL* call if it were in any way trivial. The best Description you're going to get is whatever you get from glGetString, and that probably won't line up with the sort of thing you might find in D3D/XNA.
I suggest whining at the vendors, not at the GL.
If someone can verify that the NVIDIA Control Panel can override an FNA title to use the dedicated GPU I'd like to close this... it's stupid, but it's a stupid thing FNA can't really be responsible for.
I think this might cover it? http://nvidia.custhelp.com/app/answers/detail/a_id/2615/~/how-do-i-customize-optimus-profiles-and-settings
That works for me. I didn't know you could add an option in the right-click menu, surprised that's not on by default.
Closing as "drivers pls".
Context: I finally released a new version of my game-making app CraftStudio using FNA on Windows*, Mac and Linux. Huge thanks @flibitijibibo for making it possible :). Sharing binaries and building all the packages from the comfort of my Visual Studio IDE on Windows is awesome, as well as the many Linux and Mac improvements from the previous MonoGame version.
Obviously, if you ever want a copy of CraftStudio or three, let me know, I'll be happy to oblige.
(* The launcher, client & server manager app on Windows all use FNA but I had to back out for the game runtime and revert to XNA there because of some mysterious crashes with VertexBuffers on some computers. Assuming Windows OpenGL driver issues for now...)
Anyway, unrelated to the above-mentioned crashes, I've had reports of people with an integrated chipset and a discrete NVIDIA GPU experiencing black screens or crashes until they forced the app to use the GPU. This happens even in the CraftStudio launcher, which uses nothing but a SpriteBatch, so I believe it would affect most FNA games on Windows.