helix-toolkit / helix-toolkit

Helix Toolkit is a collection of 3D components for .NET.
http://helix-toolkit.org/
MIT License

SharpDX GetBestAdapter method will not always get (The Best) one for .NET 4.5 #282

Closed cokemal closed 6 years ago

cokemal commented 8 years ago

After updating from Win7 to Win10, I got white-screen issues. None of the provided solutions worked for me, including #224. I tried to figure out the exact cause of my issue. After some debugging sessions, I found that the GetBestAdapter method will not always return "The Best" adapter, because the systemMemory and videoMemory variables are not calculated correctly when I use the .NET 4.5 version of the SharpDX branch. I realized this after running the .NET 4.0 samples (in which the memory variables are calculated correctly). Below you can see the screenshots from the debugging session.

.NET 4.0: [debugger screenshot]

.NET 4.5: [debugger screenshot]

I solved this issue by changing the best-adapter condition like this:

```csharp
if ((bestAdapter == null) || (videoMemory > bestVideoMemory) && (systemMemory > bestSystemMemory))
{
    bestAdapter = item;
    bestAdapterIndex = adapterIndex;
    bestVideoMemory = videoMemory;
    bestSystemMemory = systemMemory;
}
```

For your information.
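To see how this proposed condition behaves, here is a minimal, self-contained sketch using plain values rather than SharpDX types (the adapter names and memory figures below are invented for illustration). Note that `&&` binds tighter than `||`, so the condition reads "no adapter picked yet, OR (strictly more video memory AND strictly more system memory)", which can keep an integrated adapter over a discrete one that reports zero shared system memory:

```csharp
using System;
using System.Collections.Generic;

public class AdapterInfo
{
    public string Name;
    public long VideoMemory;   // stands in for DedicatedVideoMemory
    public long SystemMemory;  // stands in for SharedSystemMemory
}

public static class AdapterPicker
{
    // Mirrors the proposed condition; && binds tighter than ||, so this is
    // "no best yet, OR (more video memory AND more system memory)".
    public static AdapterInfo PickBest(IEnumerable<AdapterInfo> adapters)
    {
        AdapterInfo bestAdapter = null;
        long bestVideoMemory = 0;
        long bestSystemMemory = 0;
        foreach (var item in adapters)
        {
            long videoMemory = item.VideoMemory;
            long systemMemory = item.SystemMemory;
            if ((bestAdapter == null) || (videoMemory > bestVideoMemory) && (systemMemory > bestSystemMemory))
            {
                bestAdapter = item;
                bestVideoMemory = videoMemory;
                bestSystemMemory = systemMemory;
            }
        }
        return bestAdapter;
    }

    public static void Main()
    {
        var adapters = new[]
        {
            new AdapterInfo { Name = "Integrated", VideoMemory = 128,  SystemMemory = 1024 },
            new AdapterInfo { Name = "Discrete",   VideoMemory = 4096, SystemMemory = 0    },
        };
        // The discrete card has far more video memory but reports no shared
        // system memory, so the combined && keeps the integrated adapter.
        Console.WriteLine(PickBest(adapters).Name); // prints "Integrated"
    }
}
```

This is why later comments in the thread report the Intel adapter being selected with this condition: discrete cards commonly report little or no shared system memory.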

JeremyAnsel commented 6 years ago

This was fixed in #164 and #508.

holance commented 6 years ago

Fixed

juyanith commented 6 years ago

For what it is worth, I am still having this issue (10/25/2017). I have a laptop with an Nvidia GPU and a multi-monitor setup.

Changing line 135 in Effects.cs to:

```csharp
if ((bestAdapter == null) || (videoMemory > bestVideoMemory) && (systemMemory > bestSystemMemory))
```

solves the problem.

holance commented 6 years ago

@juyanith Could you provide your adapter description details from f.Adapters?

juyanith commented 6 years ago

[screenshot of adapter descriptions]

I traced through the code and the original check causes the Nvidia adapter to be chosen, while the changed code causes the Intel adapter to be chosen. I would have thought the Nvidia to be the "best" choice, so I'm surprised something about it is causing an issue. Maybe this is a driver issue?

EDIT: Some additional info that may help. The machine is a Dell laptop connected to a dock with a total of three monitors attached. The two side monitors are connected to the DVI ports on the dock and the center monitor is connected to the DP port on the laptop itself. DxDiag shows that Displays 1 and 2 (center and left) are using the Intel adapter and Display 3 is using the Nvidia. That seems like a strange arrangement but I don't know how to persuade Windows to use the Nvidia adapter for the main (center) display.

JeremyAnsel commented 6 years ago

With this code:

```csharp
if ((bestAdapter == null)
    || (videoMemory > bestVideoMemory) && (systemMemory > bestSystemMemory))
```

the Intel adapter is chosen (first adapter).

With this code:

```csharp
if ((bestAdapter == null)
    || (videoMemory > bestVideoMemory)
    || ((videoMemory == bestVideoMemory) && (systemMemory > bestSystemMemory)))
```

the NVIDIA adapter is chosen (best video memory).
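As a companion sketch (again with invented adapter data, not SharpDX types), the tie-break form prefers strictly more video memory and only consults system memory when video memory is equal, so a discrete card that reports zero shared system memory still wins:

```csharp
using System;
using System.Collections.Generic;

public class Adapter
{
    public string Name;
    public long VideoMemory;
    public long SystemMemory;
}

public static class TieBreakPicker
{
    // Tie-break condition: prefer strictly more video memory, and fall
    // back to system memory only when video memory is equal.
    public static Adapter PickBest(IEnumerable<Adapter> adapters)
    {
        Adapter best = null;
        long bestVideo = 0, bestSystem = 0;
        foreach (var item in adapters)
        {
            if ((best == null)
                || (item.VideoMemory > bestVideo)
                || ((item.VideoMemory == bestVideo) && (item.SystemMemory > bestSystem)))
            {
                best = item;
                bestVideo = item.VideoMemory;
                bestSystem = item.SystemMemory;
            }
        }
        return best;
    }

    public static void Main()
    {
        var adapters = new[]
        {
            new Adapter { Name = "Intel",  VideoMemory = 128,  SystemMemory = 1024 },
            new Adapter { Name = "NVIDIA", VideoMemory = 4096, SystemMemory = 0    },
        };
        // Video memory alone decides here; system memory is never consulted.
        Console.WriteLine(PickBest(adapters).Name); // prints "NVIDIA"
    }
}
```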

Squall-Leonhart commented 4 months ago

The logic in this conditional might actually be breaking systems with multiple dedicated GPUs that have 3+ displays spread across the cards, but I have no way to properly test it myself.

https://github.com/TexTools/FFXIV_TexTools_UI/issues/56 https://github.com/TexTools/FFXIV_TexTools_UI/issues/222

The NVIDIA GPU at index 1 may be chosen because little of its video memory is in use, instead of the NVIDIA GPU at index 0; the former is never the DWM-attached GPU.

holance commented 4 months ago

> The logic in this conditional might actually be breaking systems with multiple dedicated GPUs that have 3+ displays spread across the cards, but I have no way to properly test it myself.
>
> TexTools/FFXIV_TexTools_UI#56 TexTools/FFXIV_TexTools_UI#222
>
> The NVIDIA GPU at index 1 may be chosen because little of its video memory is in use, instead of the NVIDIA GPU at index 0; the former is never the DWM-attached GPU.

Users can pass in their own adapterIndex during EffectsManager creation if they have such a complex graphics card configuration. I don't believe our developers have such a GPU configuration to validate it either.