RetroEdit opened this issue 3 months ago
Assuming you've bisected correctly, I think this is wontfix.
I'm confident in my bisect, and wontfix would be extremely disappointing. This significantly impacts TAS workflows, and it doesn't seem like a necessary performance loss, considering it worked previously. For instance, this also affects turbo seek in TAStudio. There's no convenient way to turn off the display in general: four clicks through the UI each time. (Actually, there is a hotkey: General > Toggle Display.) Even so, I often want visual feedback at higher speed to see desyncs or changes, and it feels silly to tap toggle display continually to compensate.
Does swapping display methods help with performance?
What you're reporting is very strange; in my testing, D3D11 was much better in terms of performance. Maybe it could be worse with particularly bad drivers, but that doesn't make much sense in the modern age, where that's most likely to occur with D3D9 if anything (due to it being so old and its drivers practically just being compat shims).
I cannot reproduce this, so it must be system/driver specific.
Results of swapping display methods for me (for BizHawk 2.9.1, the respective settings are `Direct3D9` and `Alternate VSync Method` instead of `Direct3D11` and `Allow Tearing`):
| Display Method | 5ee53db53 | 2.9.1 |
| --- | --- | --- |
| Direct3D11 (default) | 410-430 FPS | 1500-2000 FPS |
| Direct3D11, Allow Tearing | 780-860 FPS | 1500-2000 FPS |
| OpenGL | 1900-2300 FPS | 1800-2100 FPS |
| GDI+ | 520-550 FPS | 900-1100 FPS |
I'll try on another computer, and also check whether my drivers are outdated or something (I'd be mildly surprised if they were). I'm currently on an i5-1135G7 laptop with integrated Intel Iris Xe graphics.
Allow Tearing having more FPS is extremely strange. Logically, given how it works, it should inflict more work on the GPU: it's forced to present every single frame, rather than simply aborting if the GPU is still busy presenting a previous frame, or not being forced to present outside of vsync (which is what unchecking Allow Tearing does). If anything, this suggests some very, very weird driver behavior. Your test does at least indicate OpenGL is performing more or less just as well, even outpacing D3D9 regardless (although GDI+ having a drastic difference seems fairly strange here; maybe something unrelated is causing that, idk).
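To illustrate that expectation with a toy simulation (all timings invented; this models the reasoning above, not BizHawk's actual present path): a loop that is forced to present every frame spends strictly more wall-clock time per frame than one that drops presents while the GPU is still busy, so the tearing path should show *lower* unthrottled FPS, not higher.

```python
# Toy model of an unthrottled emulator loop: compare forcing a present
# every frame ("allow tearing") with skipping the present while the GPU
# is still busy with the previous one. All costs are made-up numbers.

def run_loop(frames, emulate_cost, present_cost, force_present):
    """Return simulated wall-clock seconds to emulate `frames` frames."""
    now = 0.0
    gpu_free_at = 0.0  # when the GPU finishes its in-flight present
    for _ in range(frames):
        now += emulate_cost  # CPU work to emulate one frame
        if force_present:
            # Tearing path: always submit, pay the full present cost.
            now = max(now, gpu_free_at) + present_cost
            gpu_free_at = now
        elif now >= gpu_free_at:
            # GPU is idle: submit this frame's present.
            gpu_free_at = now + present_cost
        # else: GPU still busy -> drop the present, keep emulating.
    return now

FRAMES = 10_000
EMULATE = 0.0005   # 0.5 ms to emulate a frame (hypothetical)
PRESENT = 0.0015   # 1.5 ms per present (hypothetical)

tearing = FRAMES / run_loop(FRAMES, EMULATE, PRESENT, force_present=True)
skipping = FRAMES / run_loop(FRAMES, EMULATE, PRESENT, force_present=False)
print(f"forced presents: {tearing:.0f} FPS")   # → forced presents: 500 FPS
print(f"skip-when-busy:  {skipping:.0f} FPS")  # → skip-when-busy:  2000 FPS
```

With these numbers the forced-present loop is bounded by present cost while the skip-when-busy loop is bounded only by emulation cost, which is why the reported numbers (tearing being *faster*) look like driver weirdness.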
(FYI, Alternate VSync Method would do nothing if you are in fact not vsyncing.)
Not sure there's much more I can say here at the moment. My graphics driver is up to date, but I still have the issue; on a different PC, I don't. I'm somewhat curious why a newer Direct3D version would have an issue when `dxdiag` says my local setup supports up to version 12 (Feature Levels: `12_1,12_0,11_1,11_0,10_1,10_0,9_3,9_2,9_1`), but I have little insight here. Maybe I'll eventually figure it out by tinkering with machine and graphics settings.
See title. Maybe this is intended behavior, but it had me scratching my head for an hour. Bisecting, the change seems to have occurred at e293e023690990b326a594ff47b8f09cd7e2d388.
Previously, in BizHawk 2.9.1 with the display turned on and the clock unthrottled, I could get ~2000 FPS in Gambatte, but in newer dev builds I get ~420 FPS. Other cores, like mGBA, where I previously got 1000 FPS, also seem to be bounded to this ~420 FPS.
Unsurprisingly, with Config > Display... > Misc set to "Absolute Zero", I get basically equivalent if not better performance in dev builds compared to 2.9.1.