schellingb / dosbox-pure

DOSBox Pure is a new fork of DOSBox built for RetroArch/Libretro aiming for simplicity and ease of use.
GNU General Public License v2.0

Some feedback on the latest features #519

Open PoloniumRain opened 4 months ago

PoloniumRain commented 4 months ago

Just had a look at the latest changes you've made, and obviously a lot of these new features might not be finished yet and could change before the next release, but...

  1. The new 3dfx Voodoo Hardware OpenGL option is huuuuge and massively improves performance :) I assume it works similarly to DOSBox-X with glide passthrough enabled? There's also a Hardware OpenGL High Resolution option which seems to double the game's native resolution. But I think it would be better if there was a separate resolution option for this, with 2x, 3x, 4x increases and so on. This would also make it clear to the user exactly what the setting does, and this is how other emulators deal with this feature (the DuckStation core is a good example). I can now run any 3dfx supported game and hit the game's maximum frame rate with "High Resolution" 2x res enabled, so the performance headroom is there for much higher resolutions.

  2. Hardware OpenGL fixes significant visual issues like #316. I need to do more testing but it's looking great so far.

  3. The 3dfx Voodoo Gamma Correction option is really useful but by default it has incorrect gamma. There are roughly 30 DOS games that support 3dfx Voodoo and I've tried most of them. Every one of them now has incorrect gamma by default, so they look either too bright (most common) or too dark. Previously the gamma was correct in every game. So how about an "Auto" gamma setting that works exactly as things did before? Or have the current None gamma setting function this way. If there's some reason this can no longer be done with Hardware OpenGL, then I can create a list with the correct Gamma settings for every game and DBP could apply them on launch.

  4. Software Multi Threaded (default) and Software Multi Threaded, low quality run worse than before. Frame rates are cut in half and i get significant sound stuttering/popping. It's similar to when i tried 1 CPU thread in my testing. But Software Single Threaded runs the same as when i tested with 7 threads... Are these options incorrectly labelled?

  5. This only affects Hardware OpenGL - In Screamer 2, Screamer Rally and Carmageddon 2 the menu screens have the same issue as above, but not as severe and only with the menu screens. Seems like a possible threading issue but it's strange how (so far) it only affects the menus and never gameplay.

  6. Running a 3dfx accelerated game and switching from any of the Software Multi Threaded or Software Single Threaded options to Hardware OpenGL will show a black screen after restarting the game. To fix this the game needs to be closed and run again.

Anyway, just ignore any of the above problems if it's simply down to the feature being unfinished. Excellent work btw!

michael-leroy commented 4 months ago

Interesting. I wonder if this would fix the graphical issues I see in Diablo 2 in glide mode on iPadOS. Since that platform is so restricted I am guessing I won't be able to test this for a long while. It would be cool if a dosbox-pure beta (from the main/master branch) core was added to RetroArch for such purposes.

Things are sounding promising though!

schellingb commented 4 months ago

@PoloniumRain Thank you so much for the detailed feedback. It isn't enabled by default just yet because it is new. But were it not for feedback like this, it likely would have been released as it is now :-)

  1. The reason we can't have arbitrary multipliers is that this DOSBox core has a fixed maximum resolution of 1280x1024, which is reported to the frontend (RetroArch) at start, and we can't go past that (see the libretro sketch after this list). It will need some more under-the-hood changes to make this more flexible, but I can look into it if you're interested. It would be neat to see even higher resolutions :-)
  2. I don't know why the texture level-of-detail issues exist in the software renderer; it's code inherited from the original authors of the 3dfx emulation. I've noticed the issue doesn't reappear with OpenGL, which I'm glad about, but ideally software rendering would also be fixed and I don't know how...
  3. Before, games were not gamma corrected at all, so they really didn't look like the original. In DOS games, the default value of SST_GAMMA is 1.7. This can be changed in the command line with SET SST_GAMMA 1.0 to disable gamma correction. If you were to do that and set the core option to "None", it should look like it looked before. The default 1.7 seemed way too bright, so I ended up defaulting to -0.2.
  4. Oh, you're totally right, the flag check for single threaded and multi threaded was flipped in code by mistake. I fixed it now.
  5. The same issue meaning performance issue or texture issues? Rendering 2D in Voodoo with OpenGL is actually very tedious because every drawn pixel is a command that has to be translated to GL. I think full screen 2D screens can be slower on OpenGL than in software rendering due to that. We're already batching all pixel draws into a single OpenGL draw but it still needs a lot of processing to get there.
  6. RetroArch should show a little notification popup saying "To enable OpenGL hardware rendering, close and re-open.". This is needed because at startup we tell the frontend if we want hardware accelerated rendering or not. If 3dfx is set to the default (software rendering), we don't request hardware rendering and thus we are limited to 2D functions. For now this is to not break any existing compatibility (it runs on devices that don't have OpenGL). In theory we could always request the 3D context but I'm conflicted a bit. If only a small percentage of users use 3dfx, it seems an unnecessary overhead...
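
To make points 1 and 6 more concrete, here is a rough libretro-level sketch (simplified and hypothetical, not DBP's actual code; the base resolution, timing values and callback variable are placeholders): the maximum geometry is declared to the frontend once, and a hardware (OpenGL) context can only be requested up front through an environment call.

```cpp
#include "libretro.h"

static retro_environment_t environ_cb;            // provided by the frontend via retro_set_environment()
static struct retro_hw_render_callback hw_render; // filled in only when the OpenGL Voodoo option is active

// Point 1: the frontend sizes its output from max_width/max_height, so nothing
// larger than this fixed maximum can be displayed without renegotiating later.
void retro_get_system_av_info(struct retro_system_av_info* info)
{
    info->geometry.base_width   = 320;            // placeholder values
    info->geometry.base_height  = 200;
    info->geometry.max_width    = 1280;
    info->geometry.max_height   = 1024;
    info->geometry.aspect_ratio = 4.0f / 3.0f;
    info->timing.fps            = 60.0;
    info->timing.sample_rate    = 48000.0;
}

// Point 6: a hardware context has to be requested before content starts, which
// is why switching from software 3dfx to OpenGL needs a close and re-open.
static bool request_opengl_context(void)
{
    hw_render.context_type  = RETRO_HW_CONTEXT_OPENGL;
    hw_render.context_reset = nullptr;            // real code would hook context creation here
    return environ_cb(RETRO_ENVIRONMENT_SET_HW_RENDER, &hw_render);
}
```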

@michael-leroy I really wanted to try Diablo 2 but I couldn't get it to run in DOSBox Pure. The only way I got Diablo 2 running was with software rendering in windowed mode. Can you tell me how I could get it running in glide mode? Any special version or patches? Hopefully new release soon!

michael-leroy commented 4 months ago

@schellingb It took me forever to get D2 working. The key is to use version 1.05b with a no-cd patch. You can get the needed files from this project:

https://github.com/ChaosMarc/D2VersionChanger

I didn't use the .bat file; I just installed the game (full install), copied the classic/1.05b files and then the no-cd game.exe. Software mode does run fine it seems. I didn't have to use windowed mode. Trying to use later versions makes it explode with an illegal operation, usually due to fog.dll.

In glide mode the sprites are missing with white blocks around them, but the game does actually start and run. I can post screen shots/video later if need be. I tried several voodoo driver versions with no luck. At one point messing with some settings in the registry I was able to get the background rendering properly in glide mode, but almost all the sprites were missing.

Here is a video I do have of it running in software mode: https://youtu.be/2s7sPFEUKrQ

This is with an M4 ipad pro.

While on this subject, do you know what is going on with mouse grab/lock on iPadOS with dosbox-pure? Having the system cursor and the Windows 98/game cursor at the same time is less than ideal. Makes it really hard to play games like D2 and, say, Quake with a mouse. Seems like an upstream RetroArch problem. A bug has been open for about a year with no progress...

michael-leroy commented 4 months ago

Here is the video of D2 starting and running in glide mode:

https://youtu.be/QShKAzrYmWE?si=j_Q3tRD3MMmu3I2r

Let me know if you want me to file a separate issue/ticket on this. I don’t want to crowd/distract from OP’s report.

schellingb commented 4 months ago

Interesting... I wonder why it's so broken. I wonder if this would run correctly in something like PCem.

PoloniumRain commented 4 months ago

but I can look into it if you're interested. It would be neat to see even higher resolutions :-)

Yeah definitely!!

Before, games were not gamma corrected at all, so they really didn't look like the original. In DOS games, the default value of SST_GAMMA is 1.7. This can be changed in the command line with SET SST_GAMMA 1.0 to disable gamma correction. If you were to do that and set the core option to "None", it should look like it looked before. The default 1.7 seemed way too bright, so I ended up defaulting to -0.2.

This is what i'm seeing...

Gamma_Comparison

You can see that with DBP 0.9.9 the 3dfx version has gamma that closely resembles software rendering, which has correct gamma. But now it's way too bright and i have to lower gamma to -7 to get it to match DBP 0.9.9. Every 3dfx game is like this to some extent but Blood is one of the more obvious examples. I remember many games were like this on real hardware, where some were too bright/dark and i'd have to adjust my monitor or Windows settings to fix it, which was always a pain, but the great thing about previous DBP versions is that i never had to bother with this. Every game always had correct gamma.

So now with DOS 3dfx games i can at least fix the gamma by adjusting the DBP gamma setting for each game and then save my core options settings with RA. But the biggest problem here is with something like Windows 98 where the DBP gamma setting obviously cannot be configured for individual games running inside Windows.

But somehow DBP 0.9.9 and older versions have correct gamma for each Windows 98 game that i've tried so far (only around 8)... Which is weird! Because i've no idea how it could get the gamma correct for 3dfx Windows games when some of these games will need separate gamma adjustments on real hardware. But now the gamma is too bright or dark with the default -2. For example Motorhead and Pod are too bright, but Carmageddon 2 almost looks correct, which is how i'd expect real hardware to work. Anyway all i know is that the previous gamma implementation worked extremely well for me, so can you add that back as an "Auto" setting or something? Btw i'm on Windows 11 with an RTX 3080 and haven't changed any RA or driver display settings that could be affecting my gamma.

The same issue meaning performance issue or texture issues? Rendering 2D in Voodoo with OpenGL is actually very tedious because every drawn pixel is a command that has to be translated to GL. I think full screen 2D screens can be slower on OpenGL than in software rendering due to that. We're already batching all pixel draws into a single OpenGL draw but it still needs a lot of processing to get there.

Sorry for not being clear, I meant it's a performance issue. One strange thing is that GTA is 2D but runs totally fine with Hardware OpenGL. And I've just noticed with Screamer 2 that the very first screen that appears, where you select your team, has frame rate issues and sound stuttering due to it running too slow, even though this screen only shows four 3D models of the team logos. It's a very simple scene with low complexity. This same screen runs perfectly with DBP 3dfx Software rendering, yet Hardware OpenGL has better frame rates during gameplay, which has far higher scene complexity. Makes no sense.

Oh and Screamer Rally runs way too fast during gameplay with Hardware OpenGL, but this likely isn't a problem with DBP. Screamer Rally has always had issues with the game speed being tied to the frame rate. So now that Hardware OpenGL produces far higher frame rates it seems to have sped up the gameplay. It's best to use DBP 3dfx Software rendering for this game. I'm only mentioning this because I'm sure someone will eventually make a post saying how Screamer Rally isn't working properly...

RetroArch should show a little notification popup saying "To enable OpenGL hardware rendering, close and re-open.".

In theory we could always request the 3D context but I'm conflicted a bit. If only a small percentage of users use 3dfx, it seems an unnecessary overhead...

Ah i didn't notice that notification even though it's really obvious! lol. Ignore this one then :)

schellingb commented 4 months ago

Anyway all i know is that the previous gamma implementation worked extremely well for me, so can you add that back as an "Auto" setting or something?

The problem is, previously we had NO gamma implementation. It showed the raw color data as rendered, completely unadjusted. This means things like fade-in/fade-out (for example used by the spinning 3dfx logo on game startup) don't work. Games like Quake 3 Arena use a special gamma curve to make fire and stuff pop out, which resulted in very flat, dull colors before.
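
Just to illustrate what the correction does (a minimal sketch, not the actual implementation): the gamma value boils down to a per-channel lookup ramp of the kind the Voodoo's color lookup table holds.

```cpp
#include <cmath>
#include <cstdint>

// Build a 256-entry gamma ramp similar to what a Voodoo CLUT would hold.
// gamma = 1.0 is the identity ramp (the old, uncorrected look); the DOS
// default of 1.7 lifts dark values, which is what makes fades and effects
// like Quake 3's fire colors visible instead of flat.
static void build_gamma_ramp(double gamma, uint8_t ramp[256])
{
    for (int i = 0; i < 256; i++)
    {
        double corrected = std::pow(i / 255.0, 1.0 / gamma);
        ramp[i] = (uint8_t)(corrected * 255.0 + 0.5);
    }
}
```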

I was close to having 2 options "Gamma Correction in DOS" and "Gamma Correction in Windows 9x" but it seems that wouldn't be enough.

We can have a "disabled" option for gamma correction, but I'm not sure this is really what we want... The issues above are quite severe in my opinion.

Without gamma correction: image

With gamma correction: image


strange thing is that GTA is 2D but runs totally fine with Hardware OpenGL, and i've just noticed with Screamer 2 on the very first screen that appears where you select your team, it has frame rate issues and sound stuttering due to it running too slow, but this screen only shows four 3D models of the team logo's

GTA still draws the world with (rather large) polygons. The problem in Screamer isn't the 4 team logos, it's the background (307200 pixels) and text (17737 pixels), which is a full screen drawing where each pixel is processed and converted for OpenGL rendering one by one. The Voodoo API isn't set up so that the game can prepare a bitmap; it has to send a draw command for each pixel. For us this means drawing 2D on the entire screen (at 640x480) is similar to the game drawing 307200 polygons. There might be a way of catching large pixel drawings somewhere in the process and treating them specially, but it won't be easy...
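
As a rough illustration of that batching (invented names, not the actual renderer; it assumes a desktop OpenGL 1.1 header is available): every pixel the game writes becomes one point vertex, and the whole batch goes out in a single draw call.

```cpp
#include <vector>
#include <GL/gl.h>   // assumption: a desktop OpenGL 1.1 header is available

struct PixelVertex { GLfloat x, y; GLubyte r, g, b, a; };
static std::vector<PixelVertex> g_pixelBatch;

// Each linear-frame-buffer pixel write from the game is collected here.
void lfb_write_pixel(float x, float y, GLubyte r, GLubyte g, GLubyte b)
{
    g_pixelBatch.push_back({ x, y, r, g, b, 255 });
}

// One draw for the full 640x480 screen (~307200 points) instead of one GL
// command per pixel.
void flush_pixel_batch()
{
    if (g_pixelBatch.empty()) return;
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);
    glVertexPointer(2, GL_FLOAT, sizeof(PixelVertex), &g_pixelBatch[0].x);
    glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(PixelVertex), &g_pixelBatch[0].r);
    glDrawArrays(GL_POINTS, 0, (GLsizei)g_pixelBatch.size());
    glDisableClientState(GL_COLOR_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
    g_pixelBatch.clear();
}
```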

Just to be clear, some games draw images like that, some don't. For example Tomb Raider has a full screen image on the title, but it chooses to draw it with 12 or so large polygons, which of course is much faster.

schellingb commented 4 months ago

I added a bit of optimization to pixel drawing. It isn't magnitudes faster but hopefully noticeable. @PoloniumRain If you have some more time for testing, can you give the latest commit a shot?

PoloniumRain commented 4 months ago

This means things like fade-in/fade-out (for example used by the spinning 3dfx logo on game startup) doesn't work.

For me i get the opposite result... The 3dfx logo works properly in 0.9.9, but it doesn't fade in with Hardware OpenGL, it's very dark then instantly gets lighter...

https://github.com/user-attachments/assets/e48a15b3-f561-40b1-9e69-75a44f924c76

But with 3dfx Software it still works correctly. It might be a different logo to the one you're talking about though, because there are two different spinning 3dfx logos. But either way that logo in Motorhead doesn't show the fade-in effect with Hardware OpenGL. The other spinning 3dfx logo looks the same in 0.9.9 and with the latest changes.

Games like Quake 3 Arena use a special gamma curve to make fire and stuff pop out which resulted in very flat, dull colors before.

I also don't have the same issue as in the images you posted. What version of Quake 3 is that because the font is different and the text is green? I have an original Quake 3 that has been updated with the official patch 1.32. It looks like this...

Q3 Comparison

But to get Q3 to even run I had to use a driver file named OpenGL.dll and place it in the game directory. So maybe I'm getting different lighting results because of that, since I can't replicate that flat/dull look in the image you posted no matter what graphics setting I change in the game. These are my current settings...

Q3 Settings

it's the background (307200 pixels) and text (17737 pixel) which is a full screen drawing

Ohh that explains it! Pretty obvious now that i look at it again...

I added a bit of optimization to pixel drawing. It isn't magnitudes faster but hopefully noticeable. @PoloniumRain If you have some more time for testing, can you give the latest commit a shot?

That was quick! I'll test it tonight...

michael-leroy commented 4 months ago

I recall in the ancient days of yore some 3dfx games were too dark and I had to boost gamma. I think it being wildly inconsistent is fairly accurate to the real hardware?

RE: D2, can you guys reproduce the graphical glitches I see in the latest version with OpenGL enabled? When I get around to it I will try it on a Windows PC instead of iPadOS. What I wonder is if D2 never actually ran correctly on a Voodoo 1. That is one thing that is a bummer about emulating a V1: the drivers are fairly ancient by the time Diablo 2 came out. It would be really nice to instead have Voodoo Banshee emulation. Not as demanding to emulate as a full Voodoo 2, but it has newer drivers.

PoloniumRain commented 4 months ago

Ok i tested the latest commit but sadly it doesn't seem to make any difference. The frame rates are identical on the menu screens.

@michael-leroy Just gave D2 a go and can't get it to work with 3dfx either, including with the new Hardware OpenGL. Like you say i think the Voodoo is just too old. From what i can find it definitely supports Voodoo 2 and 3, but probably not Voodoo 1. This is how it's looking for me with completely missing sprites....

D2

Your best bet is to use PCem, which supports up to the Voodoo 3 (but is only available for Windows and Linux). It has far better Windows support as well because DOSBox was never really intended to run Windows. There's a good chance D2 will work, but PCem focuses heavily on accuracy, so it requires way faster CPUs than DOSBox. To emulate even the minimum required CPU for Diablo 2 (Pentium 233MHz) you'll need a CPU at least equivalent to an AMD Ryzen 3600X.

PoloniumRain commented 4 months ago

Found a fix for the menu performance issue! It happens when Emulated Performance is set to either AUTO or MAX.

For Screamer 2, Screamer Rally and any other games with this problem, if i set the Emulated Performance to 300MHz it will now run the menus perfectly (solid 60fps). If i go up to around 600MHz and beyond i will get frame rate drops and sound stuttering again, exactly the same as with AUTO. So it seems to be caused by the CPU cycles being too high on AUTO when using Hardware OpenGL. How many threads does Hardware OpenGL use?

schellingb commented 4 months ago

OpenGL doesn't use any threads. Well, your graphics card driver might, but that is out of our control. What OpenGL does, though, unlike software rendering, is make the main thread (of RetroArch) much busier than ever before. So maybe that causes the emulation thread to have to wait for the main thread, which usually never happens. The AUTO/MAX CPU cycle scaling might just not work well with it.

michael-leroy commented 4 months ago

Your image matches what I got at one point trying different voodoo drivers and registry settings on iPadOS. I do suspect that the game never worked right on a real Voodoo 1 either (EDIT: The game runs fine with a voodoo 1 under PCem). I completely understand this may be out of the scope of the project, but running D2 in glide mode on the iPad would be so rad. Especially if hw openGL actually works on iOS/iPadOS. (One day..)

I know I can just use a PC. Heck I can just play the remastered version of D2, but AFAIK the only way to play D2 on the iPad is via Dosbox-Pure. The novelty is what makes it cool. It’s actually really awesome that the game runs in Direct2d even without JIT!

UTM SE is wayyyyy too slow to even attempt without JIT, even with a M4 CPU. I might try using PCem (on windows) if it has a Voodoo 1 to see if it’s broken in a similar way.

I am not totally familiar with all the dosbox forks, but is emulating a Voodoo Banshee out of the question? This card had much newer 3dfx drivers that are less likely to have bugs in various win 95/98 games. It’s essentially a crippled V2 so perhaps easier to emulate?

schellingb commented 4 months ago

I think emulating a Voodoo Banshee in DOSBox would be quite involved. A lot of its normal VGA rendering would have to be built on top of DOSBox's VGA code, which can emulate an S3 card among a few others. Unlike Voodoo 1 and 2, which were separate, 3D-only cards, implementing something like a Banshee in DOSBox is probably not an easy endeavor. It feels like Voodoo 2 is the limit for what we have. Voodoo 2 code is mostly in there but disabled in some places because I got crashes in Windows. Perhaps this could be fixed.

Edit: I tried to enable Voodoo 2 support and the code that is there seems just unfinished. Bugs I can maybe work around, but if the DOSBox Voodoo emulation itself is unfinished, there isn't much I can do.

PoloniumRain commented 4 months ago

I did some testing with Quake II and uncapped the frame rate limit, then used the game's built-in "timedemo 1" benchmark. This is the most demanding DOS game, but it performs particularly well with Hardware OpenGL + AUTO (no menu performance issues) and is a good example of just how much performance can be gained even when the res is doubled:

| 3dfx Software @ 640x480 | Hardware OpenGL High Resolution @ 1280x960 |
|---|---|
| 28 FPS | 98 FPS |

There's also no difference AT ALL between 640x480 and 1280x960 when using Hardware OpenGL and Hardware OpenGL High Resolution, not even a 1 FPS change. The bottleneck is 100% the CPU. This is on an RTX 3080, but I think if there were higher resolution scaling options then even 6x scaling (slightly over 4K) would run fine on any remotely decent GPU, maybe even some integrated graphics.

schellingb commented 4 months ago

Great numbers! Happy to see :-)

Keep in mind though that something like a timedemo or fast-forward in the frontend is cheating a bit because it will end up not actually fully rendering all these frames. The emulation will prepare drawing of these frames by queueing up all the draw commands executed on the 3dfx card, but only when the frontend is asking for a new frame will the latest batch of draw commands actually be executed and (3D) rendered to be shown on screen. The software emulation does not have that freedom and will always make sure all 3D rendered images are finished in the video card's memory at some point. This is actually important for some games that want to read back the rendered image and can lead to issues when using OpenGL where the resulting image is always delayed (an example is Extreme Assault which queries the depth buffer to figure out where to show target markers).

Scaling the output should have a small impact on a powerful desktop GPU. On an Android phone it does have quite an impact because a mobile GPU is very often fill rate limited. Same results on a Raspberry Pi.

michael-leroy commented 4 months ago

I think emulating a Voodoo Banshee in DOSBox would be quite involved. A lot of its normal VGA rendering very much is built on top of its VGA code which can emulate an S3 card among a few others. Unlike Voodoo 1 and 2 which were separate, 3D only cards, implementing something like a Banshee in DOSBox is probably not an easy endeavor. It feels like Voodoo 2 is the limit for what we have. Voodoo 2 code is mostly in there but disabled in some places because I got crashes in Windows. Perhaps this could be fixed.

Edit: I tried to enable Voodoo 2 support and the code that is there seems just unfinished. Bugs I can maybe work around, but if the emulation in the DOSBox Voodoo emulation is unfinished, there isn't much I can do.

Luckily it seems that D2 runs fine on a Voodoo 1. I set up PCem with a 2D card and a Voodoo 1. Using the 3.01.00 drivers, Diablo 2 1.05b runs great in glide mode. So the problem lies in the Voodoo emulation in dosbox-pure.

@PoloniumRain Its really cool to see Quake 2 performance improve so much!

PoloniumRain commented 4 months ago

Keep in mind though that something like a timedemo or fastforward in the frontend is cheating a bit because it will end up not actually fully rendering all these frames.

Ah right, but I've found with Q2 that the timedemo is surprisingly representative of gameplay performance. If I start a new game, in the very first room I get almost identical frame rates to the timedemo on the same map. I've now run through roughly 30% of the game and it's been great, typically around 80 - 130 FPS, sometimes going over 200 FPS! Although of course the Voodoo can't even display that many frames per second.

With Hardware OpenGL i wonder if there would be some way around the Voodoo refresh rate limitation, so that DBP outputs as many frames as the game is rendering? Like any non-DOSBox game would typically work. Obviously it wouldn't be accurate to real hardware but that could also fix the issue with Variable Refresh Rate not working correctly.

leilei- commented 4 months ago

Somehow I knew the emulation of the clut tables would lead to the gamma shock like it's a new bug. There's a potential "always has been" astronaut meme here.

On real 3dfx hardware, everything was so gamma boosted, this gaslit competitors as being 'too dark' back in the day! The gamma's there likely from the arcade roots, where large CRTs for cabinets then really needed the correction (i.e. Konami's 1987-1993 games). Coincidentally it was also handy for users of S3 video cards which had a forced brightness boost that required monitor adjustment, so the 3dfx gamma boost was well compensated for that. Should also be noted Windows 95+ didn't have color ramps standardized for video drivers in these years.

Later Voodoo hardware (Voodoo2, Banshee/V3) would tone down the default gamma to 1.3.

It's very dark then instantly gets lighter...

that's how it is on the real thing too. The STB-era splash never had an instant appearance.

But to get Q3 to even run i had to use a driver file named OpenGL.dll and place it in the game directory. So maybe i'm getting different lighting results because of that, because i can't replicate that flat/dull look in the image you posted no matter what graphics setting i change in the game. This is my current settings...

The proper procedure is installing the latest drivers (the May 1999 "Quake3 certified" drivers, also available in GLSetup) and having Quake3 use r_gldriver 3dfxvgl. It should see the proper 3dfx gamma extension and send proper color ramps to the card to bring in the intended lighting range. This also overrides the 3dfx gamma setting, so it should look consistent with contemporary 3D hardware. The version in the screenshots is the 1.09 Q3DEMOTEST, which shouldn't have any renderer differences from the final game; the differences are mostly cosmetic, as it's only 1-2 weeks behind the gold date.

schellingb commented 4 months ago

Thanks for the insight @leilei- ! I'm fine with adding a "Disable Gamma Correction" selection to the "3dfx Voodoo Gamma Correction" core option; it would be a small change and would bring it back to what it was before. But the default will probably stay like it is, maybe we go one lower? Not that it would help much...

PoloniumRain commented 4 months ago

So... I've just tested the new resolution scaling options on a RTX 3080...

QUAKE II

| Resolution scale | Frame rate |
|---|---|
| 1x (640×480) | 98 |
| 2x (1280×960) | 98 |
| 3x (1920×1440) | 99 |
| 4x (2560×1920) | 99 |
| 5x (3200×2400) | 100 |
| 6x (3840×2880) | 100 |
| 7x (4480×3360) | 101 |
| 8x (5120×3840) | 99 |

LOL no performance loss even at 5120×3840... I've double-checked multiple maps and it's the same frame rates with any scaling setting. It's definitely working properly, at least up to 6x (4K), because that's the highest resolution display I have to test with, so 8x looks the same as 6x to me. Although the DBP Performance Statistics shows the resolution as 640x480 (it definitely isn't).

Maybe add up to 12x for 8K support? I know that's a little absurd but I doubt the frame rate would change much... I'm sure that even crappy Intel integrated graphics would be fine with 4x or higher, as long as the CPU is fast enough, since that's the real bottleneck here. With this much GPU performance headroom you could implement SSAA (supersampling). The DuckStation core has this feature and it provides the most perfectly anti-aliased image possible, better than any modern AA technique.

Btw it's great seeing these old games at such high resolutions :) Quake II is looking like the recent remaster.

Oh and Screamer Rally crashes back to the DOS command prompt screen with Hardware OpenGL, typically within 1 minute of racing. So far it's been stable with 3dfx Software rendering, and i know it's stable with 0.9.9 because i completed the Championship mode on that version.

@leilei- Thanks, i'll give Q3 a go with those drivers/files.

michael-leroy commented 4 months ago

Will oGL hardware acceleration work in iPadOS? These performance numbers are pretty rad on PC.

schellingb commented 4 months ago

Found a fix for the menu performance issue! It happens when Emulated Performance is set to either AUTO or MAX.

For Screamer 2, Screamer Rally and any other games with this problem, if i set the Emulated Performance to 300MHz it will now run the menus perfectly (solid 60fps). If i go up to around 600MHz and beyond i will get frame rate drops and sound stuttering again, exactly the same as with AUTO. So it seems to be caused by the CPU cycles being too high on AUTO when using Hardware OpenGL.

Have you noticed any improvement with this since the last commit? It is an attempt that I'm not quite sure is correct but it does seem to help. But it also reduces the emulated CPU's speed a bit while using OpenGL Voodoo rendering (to avoid the stuttering).

PoloniumRain commented 4 months ago

Have you noticed any improvement with this since the last commit? It is an attempt that I'm not quite sure is correct but it does seem to help. But it also reduces the emulated CPU's speed a bit while using OpenGL Voodoo rendering (to avoid the stuttering).

Yeah the last commit definitely helps :) For example the menus in Screamer 2 no longer have any problems until I go up to 5x scaling, whereas previously they would have issues with no scaling at all. Other games have similar results to this. But it does have a negative effect with Quake II, where I've lost about 15 FPS. There are some areas in certain maps where it will now dip under 60fps. The frame rate drop isn't visibly noticeable with any other games because they're all far less demanding, so if I uncap their frame rates they're still in the hundreds of FPS. So overall I think it works well.

Maybe DBP could detect Quake II at launch (q2.exe) and not apply the latest commit fix to just this game? It was the one game where it had zero issues with AUTO anyway. I've tried manually adjusting the cycles but nothing works quite as well as AUTO did before this commit.

PoloniumRain commented 4 months ago

Problems i've encountered with Hardware OpenGL so far:

Since my last post i've now tried many more 3dfx supported games on both DOS and Windows. So far with the latest commit i can't find a single game that has menu performance issues with 1x - 4x scaling. Some encounter emulation slowdown with 5x on the menu screens (audio stuttering), while others are fine at 8x. It's made a big improvement! Again the ONE exception being Quake II which ran better previously.

PoloniumRain commented 4 months ago

The latest commit from a few hours ago works great. One example, Screamer 2 can now be scaled to 8x resolution without any audio stuttering on the menus! It's fixed stuttering for all the games i've tried, with any level of scaling. It's come a long way :)

Btw some games will show graphical corruption at the bottom of the screen after changing the scaling to 3x or higher when the game is running. This happened with every commit i've tried but forgot to mention it.

Some examples...

Pyl:

gfx_corruption

QDOS (DOS Quake with 3dfx Voodoo support and extra features):

gfx_corruption2

Ignition shows a similar problem:

gfx_corruption3

Going back down to 1x scaling won't fix it, but sometimes it will go away by exiting the level and loading it again. Or closing the game and running it again will always fix it.

PoloniumRain commented 4 months ago

The graphics corruption with scaling is a bigger problem than I first thought. It also happens with many games that don't have 3dfx Voodoo support, as long as Hardware OpenGL + scaling is enabled. Closing and running the game again will not fix it in these cases. Hardware OpenGL needs to be disabled or scaling set to around 3x or lower. Obviously scaling won't work on these games anyway.

Edit: I wonder if this has anything to do with the issue that causes the Windows 98 desktop to not appear when the resolution exceeds roughly 2560x1920. Some games, whether they have 3dfx support or not, will have issues when going too high, while others are fine at 8x.

Here's Destruction Derby, Screamer and Wipeout with 8x scaling:

gfx_corruption_software

schellingb commented 4 months ago

Can you provide more information on how to reproduce this? Does this require first running a game with OpenGL and then, without closing the content, switching to another game? Is this just in Windows 9x? Does this require switching the scale factor while the core is running or does it happen when it is at > 3x from the start?

Can you enable the statistics in RetroArch via Settings -> User Interface -> On-Screen Notifications -> Notification Visibility -> Display Statistics and tell me what is shown for "Size:" and "Max Size:" when things are weird?

Edit: I think I figured it out :-)

PoloniumRain commented 4 months ago

Edit: I think I figured it out :-)

Commit 6975ea7 completely fixes the problem with all games that use a software render engine :)

But one small issue is the graphical corruption persists with some DOS games that support 3dfx Voodoo acceleration (i've only found 2 games so far, so it's not a big problem). Both Pyl and QDOS are DOS based games that will show graphical corruption at the very bottom of the screen.

To reproduce this:

  1. Run Pyl or QDOS with 3dfx Voodoo acceleration and OpenGL enabled in DBP. Make sure scaling is set to 1x.
  2. Start playing either game (get off the menu screens and into actual gameplay).
  3. While the game is running increase the scaling to 8x.
  4. You should now see graphical corruption at the bottom of the screen.
  5. Press the escape key to bring up either game's in-game menu screen.
  6. The graphical corruption will now disappear (anything that updates the affected area of the screen will fix the issue).

schellingb commented 4 months ago

It looks like these are areas of the UI that only update when something changes. We can't really force the game to redraw something it thinks is already on screen, so I added rescaling of the image when changing the scale factor. It can end up looking a bit blurry but it's certainly much better than before.
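
In rough terms (a simplified sketch with invented names, not the actual code), the idea is to resample the last rendered frame into the newly sized buffer so stale UI regions carry over instead of turning into garbage; a filtered resample is also what can make the old content look slightly soft until the game redraws it.

```cpp
#include <cstdint>
#include <vector>

// Nearest-neighbor resample of the previous frame into the new output size
// (real code presumably does a filtered blit on the GPU, hence the slight blur).
static std::vector<uint32_t> rescale_old_frame(const std::vector<uint32_t>& src,
                                               int srcW, int srcH, int dstW, int dstH)
{
    std::vector<uint32_t> dst((size_t)dstW * dstH);
    for (int y = 0; y < dstH; y++)
        for (int x = 0; x < dstW; x++)
            dst[(size_t)y * dstW + x] =
                src[(size_t)(y * srcH / dstH) * srcW + (x * srcW / dstW)];
    return dst;
}
```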

PoloniumRain commented 4 months ago

Nice! Clever way of fixing it. I'm not seeing any blurring at all though? But i'm relieved there isn't any!

Btw i've found why Windows 98 games were showing a black screen with high scaling values. It's caused by the RA shader i was using. It only happens with this one shader so far. This is how to reproduce it:

  1. Run Win98 with 'Hardware OpenGL' enabled.
  2. Enable the shader sharp-bilinear.slang (located in RetroArch\shaders\shaders_slang\pixel-art-scaling\shaders).
  3. Run any Win98 game.
  4. Increase scaling to 5x or higher.
  5. The entire screen should now turn black (i know for sure this always happens with Carmageddon 2 during gameplay).

The reason i use this bilinear shader is because it fixes the ugly Windows 9x text rendering on LCD/OLED displays without looking too blurry. But i've found that this shader doesn't even work any more when OpenGL is enabled. It only works when '3dfx Voodoo Performance' is set to Software. But this is just a small issue and i'm hesitant to even mention it. It might just be a problem with this shader.

And here's how to reproduce an issue with Descent II displaying graphical issues with Hardware OpenGL:

  1. Run the ExoDOS version of Descent II with OpenGL enabled. Any scaling setting can be used.
  2. After running the included RUN.BAT file and choosing the 3dfx acceleration mode, you will see a black screen (it's meant to show green text).
  3. If you press escape you can get past the black screen to the game menus.
  4. Start a New Game and some HUD elements will be missing or show black boxes.
  5. A good example is if you press F1 while playing, it will show a pop-up menu with black boxes instead of text.

However if you switch to 3dfx Software then all of these screens and graphical elements will work correctly. OpenGL specifically seems to have a problem rendering the text in this game.

schellingb commented 3 months ago

Descent 2 might be a lost cause with how we're doing OpenGL. I hinted at this before with

The software emulation does not have that freedom and will always make sure all 3D rendered images will be finished in the video cards memory at some point. This is actually important for some games that want to read back the rendered image and can lead to issues when using OpenGL where the resulting image is always delayed (an example is Extreme Assault which queries the depth buffer to figure out where to show target markers).

So what Descent 2 with 3dfx does is render part of an image and then immediately read back the image it thinks was drawn. It then builds the HUD graphics with what it "sees" on screen. This works in software, but in OpenGL things are much more asynchronous. The draw commands are queued up and sent to the GPU later. And the way hardware rendering works nowadays, the graphics card does the same: it receives the commands and starts drawing in the background while the program continues. So only later, once the image has actually been rendered out and shown on screen, can we read it back without stalling the program. That's why Descent reuses "old" HUD graphics when drawing. One example: selecting "Credits" on the main menu and then pressing escape to get back will show the background of the credits as the background of the main menu.
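
To illustrate why that readback is painful with OpenGL (a generic sketch, not the actual DBP code): handing the game the pixels it just "drew" means flushing every queued command and waiting for the GPU first.

```cpp
#include <cstdint>
#include <vector>
#include <GL/gl.h>   // assumption: desktop OpenGL headers are available

// When the emulated game asks to read the frame buffer (as Descent 2 does for
// its HUD), every queued draw command has to be submitted and completed first,
// otherwise we'd hand back pixels from an older frame.
std::vector<uint32_t> read_back_emulated_framebuffer(int width, int height)
{
    std::vector<uint32_t> pixels((size_t)width * height);
    glFinish();  // stall until the GPU has actually finished rendering
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
    return pixels;  // software rendering has this data available immediately instead
}
```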

PoloniumRain commented 3 months ago

Descent 2 might be a lost cause

Yeah you say that but now you've gone and fixed it already! Don't know how you always figure these problems out so fast...

The only minor thing i've found are black rectangles behind some of the text on the screen that appears after completing each level (but all of the text is still fully readable). It's insignificant. As far as i can see the gameplay no longer has any graphical issues and that's what matters most.

schellingb commented 3 months ago

Ah, yeah, what I meant is that I fixed some things but not everything :-)

Btw, the issue I fixed should have been logged for you too, if you enable core logging in RetroArch. Next time you see something weird you can look at the log and see if it logs something like [DBP:GL] Error compiling shader with a bit of information on what failed. This can either mean missing hardware support (on something like an older phone) or, as in this case, a code mistake.
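
For reference, a log line like that typically comes from a standard compile-status check after glCompileShader; a generic sketch of the pattern (not necessarily the exact code in DBP, and the GL header/loader is an assumption):

```cpp
#include <cstdio>
#include <GLES2/gl2.h>   // assumption: or whatever GL loader header the core uses

// Generic compile-status check that produces a log line similar to
// "[DBP:GL] Error compiling shader ..." when the driver rejects a shader.
static bool compile_shader_checked(GLuint shader, const char* source)
{
    glShaderSource(shader, 1, &source, nullptr);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE)
    {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::fprintf(stderr, "[DBP:GL] Error compiling shader: %s\n", log);
        return false;
    }
    return true;
}
```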

I haven't said it recently, but thank you so much for all the testing, this feature is already infinitely better due to it. Really appreciated!

leilei- commented 3 months ago

Incoming on Windows 98 - Runs too fast. Can be somewhat fixed by setting Emulated Performance to around 12000 cycles.

Real behavior too; the final game should have a frame limiter to keep it at 30, though back in the day (earlier in '98, on earlier OEM versions) this wasn't foreseen as much, when a good P2 233 w/ Riva128/V1 couldn't blow through it at high frames.

PoloniumRain commented 3 months ago

Really appreciated!

No problem :)

see if it logs something like [DBP:GL] Error compiling shader

Just tried this but didn't get any errors so far...

Btw, here's how to reproduce the Screamer Rally crash with OpenGL that i mentioned previously. This one is really simple:

  1. Run Screamer Rally in DOS with 3dfx acceleration and with Hardware OpenGL enabled.
  2. Start any race and within 1 minute the game should crash back to the DOS prompt screen.

However, if you just sit at the starting line and don't play the game, it won't crash. It only crashes shortly after you start controlling the car, and in the same areas of each track. It will also crash within 30 seconds if you run the game and watch the demo that appears immediately after the opening logo screens. This game only crashes with OpenGL btw. Here's the core log file but I can't see any obvious problems in it... log.zip

And i also found that Ignition crashes with both Hardware OpenGL AND 3dfx Software. To reproduce it:

  1. Run Ignition in DOS with either Hardware OpenGL or Software enabled (run IGN_DOS.BAT if using the eXoDOS version).
  2. Choose any track in Time Trial and usually within 4-ish minutes of play the game will crash back to the DOS prompt screen.

There are lots of illegal read errors in the Ignition_log.zip and a "voodoo readd unaligned" notification appeared (spelling error?). Since the 3dfx Software core option also crashes, maybe it's a problem with the game, but I haven't had time to test if it works on PCem. Just wanted to mention this because it's a pretty good game and it would be nice if it was playable.

the final game should have a frame limiter to keep it at 30. though back in the day (earlier in 98 on earlier oem versions) this wasn't foreseen as much when a good p2 233 w/ riva128/v1 couldn't blow through it at high frames

Thanks again @leilei-, i've now found a version of Incoming with the frame rate limiter. Works perfectly!

schellingb commented 3 months ago

Screamer Rally crashes exactly the same for me with both OpenGL and software renderer. Are you sure it's OpenGL only for you? Both the intro demo and playing the game crashes at the same spot in both modes.

Also, can you give srdemo.zip from https://www.dosgamesarchive.com/download/screamer-rally a try? Getting no crashes with that. It shows the same intro demo and same tracks. Maybe it's a newer version? Is there a patch?

PoloniumRain commented 3 months ago

You're right... That's soo weird... Because about a month or two ago i fully completed Screamer Rally and i'm kinda sure i played the 3dfx accelerated version. It was with DBP but didn't have any crashes.

And yeah that demo doesn't crash for me either. It ALSO doesn't run too fast like the eXoDOS version does, which is really bad when using OpenGL, but also has problems with 3dfx Software. Which is also weird, because many times in the past i've looked for patches that could fix this issue with the games speed, but never found anything. You might have just found a fix that nobody knows about lol. I'm going to try getting the demo files working with the full game...

PoloniumRain commented 3 months ago

Found the real problem with Screamer Rally crashing... It's the setting 3dfx Voodoo Emulation > 12MB memory in Core Options.

12MB is the default setting and it makes the game crash, but i've never seen this happen before with any other game. SR is totally stable when i change this setting to 4MB memory.

Maybe it would be best to make 4MB the default setting? I could be wrong, but didn't all Voodoo cards only have 4MB, so would 12MB even do anything with games, like would it actually be recognised? I know that 3DMark 99 on Windows 98 will recognise 12MB... kind of. It says there's 4MB of video memory when it's set to 12MB, and then says there's only 2MB of video memory when it's set to 4MB. Strange.

Edit: Found this interesting video with a Voodoo 8MB mod. Increased video memory allows some games to run at 800x600 (and i just confirmed this with Motorhead) and it also improves performance a little in some cases, but at 13:39 he mentions there's reports of some games not working with the increased memory. Maybe add this info to the description underneath the 3dfx Voodoo Emulation setting?

michael-leroy commented 3 months ago

I have a partial explanation for why 3DMark only saw 2MB of memory. I believe with these old Voodoo cards the memory is split between frame buffer and texture memory. 3DMark probably assumes 4MB total and therefore 2MB texture memory. PCem lets you set these individually when emulating a Voodoo 1.

I wonder if in dosbox-pure the 12MB option is not split half and half?

schellingb commented 3 months ago

These are our 2 Voodoo memory configurations:

  • 12 MB: 4 MB frame buffer, two texture mapping units with 4 MB each
  • 4 MB: 2 MB frame buffer, one texture mapping unit with 2 MB

Having a larger frame buffer allows higher resolutions and having two texture mapping units allows rendering and blending two textures. Both desired features I'm sure. If just a very small number of games don't work with the 12 MB setting I don't think there's much we can do. I don't know off the top of my head how many games make use of the two texture mapping units but I can look into that. I'd guess on Windows 9x it's quite a few?

PoloniumRain commented 3 months ago

I don't know on the top of my head how many games make use of the two texture mapping units but I can look into that. I'd guess on Windows 9x it's quite a few?

I think it will be loads. Any Unreal Engine based games (like Deus Ex) will be some of them. So on second thought it's best to keep 12MB as the default setting :)

@michael-leroy Btw i tried Diablo 2 with 4MB and it doesn't even launch the game lol, just gives a Glide error.

michael-leroy commented 3 months ago

Yeah I tried that as well. The game for sure needs > 4mb video memory to work. I think D2's issue is with how textures are handled. Perhaps D2 uses a different texture caching "mode" that is not yet implemented? But I do hope some fixes done in this area accidentally makes games like D2 work.

I can only speculate because I have begun to look at voodoo.cpp but it's a lot to take in and understand. Is there a way to debug this and know what Glide API calls a game is making?

PoloniumRain commented 3 months ago

@schellingb There's a cycles problem with a recent commit, maybe this one, where the CPU cycles can get stuck at 3000 when using the default AUTO option for Emulated Performance.

To recreate this issue:

  1. Use the eXoDOS version of Battle Arena Toshinden (i've only tested this game so i know this works).
  2. Go to the games folder and create a new batch file in the same directory as the run.bat batch file.
  3. Paste this inside the new batch file....
    cd tsd
    cycles 60000
    call tsd3dfx
    cd ..
  4. Run the above .BAT file with DBP.
  5. Press F1 to bring up the Quick Menu.
  6. Hold down the SHIFT key and click on Restart.
  7. On the Start Menu, select the .BAT file you just created, and set it to Skip showing first 10 frames (or any frame number).
  8. Press Enter to run the .BAT file and the cycles will now stay at just 3000, so the game runs extremely slow.

DBP completely ignores the cycles 60000 command in the batch file. Changing the Emulated Performance option also won't do anything; the cycles will remain at 3000. However, if you delete cycles 60000 from the batch file then cycles will work correctly on AUTO or any other setting. It also correctly uses 60000 cycles if you do not use Skip showing first xx frames.

Btw, commit ff213d5 is very useful and works well so far. I was just about to ask for this feature, it's like you read my mind lol.

@michael-leroy I can't help with debugging API calls, but recently this happened...

https://github.com/user-attachments/assets/733b9723-1c78-4772-b939-8c76deecdb3a

Everything is being displayed with 3dfx, it's just the colours that are wrong. The weird thing is I didn't change any settings and it just happened to look like this. When I closed the core and ran it again, D2 went back to looking exactly how it does in your video.

In #316 I mentioned how texture filtering (especially trilinear) + mipmaps don't work correctly (they have visual issues), but disabling either or both of those features will get textures to display correctly. I don't think D2 has any ability to disable either though, but I highly doubt it uses mipmaps anyway. And with other games, when I use the new OpenGL option it will fix these texture issues, but OpenGL doesn't help D2 at all, so maybe it isn't filtering/mipmap related. But I still think it's texture related in some way.

michael-leroy commented 3 months ago

Oh man D2 is so close to working. I do agree it's a problem with the textures or perhaps the timing of stuff being drawn. For example, perhaps the time it sorta worked you had a lower/higher CPU or Voodoo cycle count? Because I have seen the graphical corruption or white boxes but also at times no sprites/textures other than the background being displayed. I sorta want to try messing with CPU cycles and turning off auto to see if the corruption changes…

schellingb commented 3 months ago

Hmm :-) DBP-Diablo2-3dfx

So... our 3dfx core option offers two configurations:

  • 12 MB: 4 MB frame buffer, 2 texture units with 4 MB each
  • 4 MB: 2 MB frame buffer, 1 texture unit with 2 MB

So with the 12 MB config I get the broken textures. With the 4 MB config I get an error at startup (I assume not enough memory). But... I cooked up an 8 MB config (4 MB frame buffer, 1 texture unit with 4 MB) and look at that. For now I'll add that as a choice for the core option. Maybe with some more testing we can find out which config offers the most compatibility; it might not be the current one (12 MB).
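
Purely as illustration of what those memory choices amount to (hypothetical names and structure, not the actual core-option code), the option essentially selects a frame buffer size plus a TMU count and per-TMU texture memory:

```cpp
// Hypothetical mapping of the "3dfx Voodoo Emulation" memory choices to the
// emulated board layout (names invented for illustration, not DBP's code).
struct VoodooMemConfig
{
    int fb_mb;   // frame buffer memory: a 4 MB buffer is what allows 800x600
    int tmus;    // texture mapping units: 2 enables single-pass dual texturing
    int tmu_mb;  // texture memory per TMU
};

static VoodooMemConfig config_for_option(int option_mb)
{
    switch (option_mb)
    {
        case 12: return { 4, 2, 4 };  // 4 MB FB, two TMUs with 4 MB each
        case  8: return { 4, 1, 4 };  // 4 MB FB, one TMU with 4 MB (the new choice)
        default: return { 2, 1, 2 };  // 4 MB option: 2 MB FB, one TMU with 2 MB
    }
}
```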

michael-leroy commented 3 months ago

Wow, amazing! Good idea for the 8MB setting. So perhaps this means the original issue came down to texture memory allocation or reads? Perhaps the textures were split between the two texture units (and their memory) in a way that led to the corruption? Perhaps due to the way D2 streams in texture data?

With HW OpenGL support enabled, does it even make sense to simulate two texture units each with their own memory banks? I am completely naive about how all this works, but I wonder if you get better game compatibility with the 8mb mode. I wonder if the games/drivers can handle it having one texture unit with 8mb of texture memory...

Your recent additions are really exciting! Great work again. Now if this work here can trickle down eventually to the iPadOS/retro arch version and this bug gets fixed on the RA side for mouse grab, (https://github.com/libretro/RetroArch/issues/15483) I could play D2 on my iPad pro which would be really novel. Really appreciate you both for looking into this!

Some really cool games to see work would be Unreal (crashes at startup; appears to be a crash due to physics bugs, maybe caused by CPU emulation?). A UT99 bot match would be amazing as well, but perhaps outside the scope of this project.

EDIT: Unreal Gold works just fine with HW GL and the 8mb and 12mb settings when building from master. I only get crashes on iPadOS using the latest release bundled with RA.

PoloniumRain commented 3 months ago

Nice one @schellingb! You always figure something out :) Can confirm Diablo 2 now works for me as well.

I tried some other games and didn't run into any issues, and since the 8MB setting still uses a 4MB frame buffer, the same games can still use 800x600, just like with 12MB. The only difference I've found is a very slight change in the frame rates depending on the game or map/area. 12MB is usually faster but barely.

QUAKE (DOS source port with 3dfx support). Settings: 800x600 with OpenGL

| Timedemo | 8MB frames per second | 12MB frames per second |
|---|---|---|
| Demo1 | 148 | 152 |
| Demo2 | 159 | 162 |
| Demo3 | 135 | 138 |

QUAKE II (DOS source port with 3dfx support). Settings: 640x480 with OpenGL

| Timedemo | 8MB frames per second | 12MB frames per second |
|---|---|---|
| Demo1 | 100 | 98 |
| Demo2 | 97 | 97 |

Descent II (DOS). Settings: 800x600 with OpenGL

| Map | 8MB frames per second | 12MB frames per second |
|---|---|---|
| Ahayweh Gate | 694 | 700 |

Each benchmark was run several times and the frame rate was then averaged out, since there are always some minor variations on each run. So I'm pretty confident there are some small performance differences between 8MB and 12MB, but it's basically nothing (at least with these games).

3DMark 99 in Windows 98 also gets practically the same score with 8MB and 12MB.

More importantly I've already found another game where 8MB fixes the exact same texture issue that Diablo 2 had...

Deus Ex with 12MB and OpenGL:

https://github.com/user-attachments/assets/0ce3686c-dd78-4d8c-a203-994f356f9ead

Deus Ex with 8MB and OpenGL:

https://github.com/user-attachments/assets/fcf7c244-d547-4a0d-97e9-d7cb687a9291

But one difference to Diablo 2 is that using the core's Software rendering option will get Deus Ex working with 12MB; it only breaks with 12MB + OpenGL. Diablo 2, on the other hand, didn't work with either Software rendering or OpenGL.

It's also like the Descent II issue that you mostly fixed with this commit, in the sense that if there's something that updates/covers the messed up looking textures, it can sometimes fix or improve the problem. In this case pressing F1 to bring up the RA Quick Menu will make the game's main menu screen display correctly.

If 12MB can't be fixed then so far it's looking like 8MB might be better off as the default setting. Need to test more but i need some sleep!

michael-leroy commented 3 months ago

I can confirm as well that Diablo 2 seems to run great with the latest master build. For Unreal Gold I was incorrect; it runs just fine in Windows, including with HW GL on. My crash on game startup is only on iPadOS.