3dfxdev / EDGE

EDGE Source Code
http://edge2.sf.net

Very slow performance and heavy stuttering on Linux #47

Closed — gameblabla closed this issue 6 years ago

gameblabla commented 7 years ago

Hello. For about a year now, the Linux version has not been running smoothly. Even when the FPS counter is at 59 FPS, it still stutters like mad. 3DGE was compiled both with extreme optimisation ("-O3 -march=native -mtune=native") and with the default flags. No matter what you're playing, the engine stutters a lot, even though the FPS counter says the game runs fine. This was on a Ryzen 5 1500X 3.5 GHz (SMT disabled) with a GeForce 780.

On an older laptop with a 2 GHz Pentium Dual Core and an Intel GM45 chipset, it runs even worse. It used to run okay in 2015/early 2016, until interpolation was added. Even with interpolation disabled, it still runs like crap. At a high resolution (720p), it freezes after 2-3 seconds of playing the first map. I had to lower the resolution to 640x480 to make it run without any weird freezes, and even then it was skipping a lot of frames.

I also tried disabling post-rendering effects in the code and it helped a bit, but didn't remove the stutter. Do you suffer from stuttering issues on Windows? It seems not. Is this caused by poor frame-skipping code? It skipped frames on my older laptop, so there must be something like that.

I thought that maybe it was an issue with system libraries, but I recompiled an older version against the current system libraries and it ran fine.

Corbachu commented 7 years ago

@gameblabla - Sorry to reply so late.

I've recently made some slight changes to the interpolation code, so please try again and see if you get some performance increases. As pointed out by Graf, E_Display was calling N_SetInterpolater() when it didn't really need to -- thusly, I moved it into P_Tick(), and so far on my system, it seems to result in less "wasted" frames.
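
Rough idea of that change, in case it helps anyone reading along (E_Display, N_SetInterpolater, and P_Tick are the real names mentioned above; everything else in this sketch is my own assumption, not the actual EDGE code):

```cpp
// Sketch only: move the interpolation setup out of the per-frame display
// path and into the per-tic game logic. Bodies and signatures are assumed.

// Before: E_Display() runs every rendered frame, so the interpolater
// state was being reset far more often than the game state changes.
void E_Display(void)
{
    // N_SetInterpolater();   // removed: wasted work on every frame
    RGL_RenderFrame();        // hypothetical render call
}

// After: P_Tick() runs once per 35 Hz game tic, which is the only time
// the interpolation endpoints actually change.
void P_Tick(void)
{
    N_SetInterpolater();      // snapshot old/new state once per tic
    P_RunThinkers();          // hypothetical per-tic game logic
}
```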

Since then, UsernameAK has made changes to the CMake system to support Linux. I also recommend you switch to the "arrangedsource" branch, as that one has the most changes so far. It hasn't hit "main" yet because I'm waiting to see if you Linux users can compile with the source files/directories changed up slightly.

So, please try that out first, then the changed code, and let me know if you notice any performance increases =)

usernameak commented 7 years ago

@Corbachu I have the same problems :D

usernameak commented 7 years ago

As noted in 3DGE-ISSUES.txt, -norenderbuffers is required on Linux with ATI. But I have Intel HD Graphics O_o

Corbachu commented 6 years ago

Intel HD Graphics... generally EDGE will run like crap on it. Any OpenGL-heavy engine isn't expected to perform well on Intel HD most of the time. I don't have one, so I can't say for sure. The desktop machine, though, is beefy enough... strange...

usernameak commented 6 years ago

@Corbachu Hmmm... Doom 3 gives 60 fps without glitches on my Intel HD at maximum graphics quality (I know a guy who has ATI and it gives 15 fps, even though the specs are almost the same), but EDGE runs like crap on it. So... I can say the EDGE renderer is shit. I have an i3-4150, HD Graphics 4400, and 6 GB RAM.

OrdinaryMagician commented 6 years ago

i5-6400 and a GTX 680 here, using the proprietary NVIDIA drivers, of course.

I also do notice some stuttering despite it running constantly at 60 FPS.

usernameak commented 6 years ago

I just walked through the source code with GDB, and I've found a problem: https://github.com/3dfxdev/hyper3DGE/blob/3fdb5f080b7517ebd3473d2dad6cee94ed2ee989/src/r_shaderprogram.cc#L145 gives a GL_INVALID_VALUE error

Corbachu commented 6 years ago

Is that not supported by Linux video drivers? I've heard Linux drivers are finicky, but if you have suggestions then I'm all ears. Maybe defaulting to -norenderbuffers on Linux would help?

OrdinaryMagician commented 6 years ago

From the OpenGL docs: GL_INVALID_VALUE is generated if colorNumber is greater than or equal to GL_MAX_DRAW_BUFFERS.

I've tested that the value of GL_MAX_DRAW_BUFFERS is 8 in my system.
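
For reference, here's a quick way to check that limit at runtime; plain OpenGL, nothing EDGE-specific, and it assumes a current GL context (and GL 2.0+ headers or a loader such as GLEW already initialized):

```cpp
#include <cstdio>
#include <GL/gl.h>

// Query the driver's draw-buffer limit. Any colorNumber handed to a call
// like glBindFragDataLocation must be strictly less than this value,
// otherwise GL_INVALID_VALUE is generated.
void CheckDrawBufferLimit(void)
{
    GLint max_draw_buffers = 0;
    glGetIntegerv(GL_MAX_DRAW_BUFFERS, &max_draw_buffers);
    std::printf("GL_MAX_DRAW_BUFFERS = %d\n", max_draw_buffers);
}
```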

Corbachu commented 6 years ago

@usernameak well, it's not that it's a shitty renderer, it just wasn't designed for interpolation. We really gotta take a look at the lerp code and fix it up. I think most of it is bottlenecking in E_Tick() in e_main.cc, but a lot of the code resides in n_network.cc.

usernameak commented 6 years ago

@Corbachu the reason for this bug is that the NVIDIA Linux driver has looser limits on glBlitFramebuffer than the other Linux drivers

Corbachu commented 6 years ago

okay, so what we need to do is (if you can), wrap an #ifdef LINUX around that and have it check that the extension limit is in a valid range. You can insert that via R_Main.cc, in RGL_CheckExtensions(), so if it fails that test we can disable the shaderprogram much earlier in the initialisation. That's the best solution IMO. :-)
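
Something along these lines, maybe (RGL_CheckExtensions() is the real hook mentioned above; MAX_FRAG_OUTPUTS, r_renderbuffers, and the I_Printf usage are assumptions made just for this sketch):

```cpp
// Sketch: bail out of the shader-program path early if the driver's
// draw-buffer limit can't cover what r_shaderprogram.cc asks for.
void RGL_CheckExtensions(void)
{
    // ... existing extension probing ...

#ifdef LINUX
    GLint max_draw_buffers = 0;
    glGetIntegerv(GL_MAX_DRAW_BUFFERS, &max_draw_buffers);

    // MAX_FRAG_OUTPUTS is a hypothetical constant for the highest
    // colorNumber the shader program will try to bind.
    if (max_draw_buffers < MAX_FRAG_OUTPUTS)
    {
        I_Printf("OpenGL: draw-buffer limit too low, disabling render buffers.\n");
        r_renderbuffers.d = 0;   // hypothetical cvar, mirrors -norenderbuffers
    }
#endif
}
```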

After you handle that, we can begin fixing up the interpolation code. EDGE does this backwards (literally), while other DOOM ports (notably GZD) do forward prediction, so that's something we all need to discuss to prevent framerate issues.
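
For reference, the two approaches boil down to something like this (a generic illustration of the terms above, not EDGE's or GZDoom's actual code; fraction is how far we are between two game tics, in 0..1):

```cpp
// Back-interpolation: render between the two most recent tics. Smooth,
// but always displays roughly one tic in the past.
float LerpBackward(float prev, float cur, float fraction)
{
    return prev + (cur - prev) * fraction;
}

// Forward prediction / extrapolation: project past the latest tic.
// Avoids the latency, but can overshoot and then visibly correct itself.
float ExtrapolateForward(float prev, float cur, float fraction)
{
    return cur + (cur - prev) * fraction;
}
```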

usernameak commented 6 years ago

@Corbachu I remembered that the Linux Intel driver has a very bad shader preprocessor that almost doesn't work. So... we have to write our own! And yes, add that fucking gitinfo.h to .gitignore

Corbachu commented 6 years ago

I thought gitinfo was added! I remember adding it before... lol. I'll do it when I get off work tonight :D

usernameak commented 6 years ago

@Corbachu everything becomes smooth when I set r_lerp 0

Corbachu commented 6 years ago

yeah, I figured that was the cause. Rachael had me create a lerp2 branch so we can focus on rewriting all of the interpolation code :D

Corbachu commented 6 years ago

@usernameak @gameblabla

Please build and try this again -- I have made numerous changes to the swap interval code that should fix the stuttering and slowdowns xD

Please test on Linux so we can compile and release a 2.1.0-Test3 binary for Linux folk =)

Corbachu commented 6 years ago

@gameblabla Can this be revisited by some Linux folk? I made important changes that fixed the SDL refresh timing for the screen, so now there's hardly any stutter unless you are playing a giant map like Frozen Time.

Please look into this again so I can close it. =)

Corbachu commented 6 years ago

@dsdman can you confirm or deny any of these slowdown issues for me? I'm still on the last steps of setting up my VM, but I would like real-OS performance tested. I really appreciate the help! Let me know if you can or can't. Thanks, friend.

dsdman commented 6 years ago

@Corbachu I can't confirm nor deny any of these slowdowns, mainly because I still do not have a display in-game (no menus, no hud, no graphics, no anything). I plan on opening a ticket later when I have time. That being said, I turned on debug_fps in the config file, and the fps does not show up in the log, so I'm assuming this option displays/draws the fps on screen. Also on a side-note, I don't think there would be any slowdowns, as even Doom 2016 plays at 60 fps (anywhere below 900p) on this machine.

Corbachu commented 6 years ago

@dsdman restart the compile with debug and developers enabled in the code. Does EDGE generate the debug textfile? If so, please paste the entire contents or pop it via PasteBin. Please also paste the additional text files (glsl.log and glext.log) as those will help me determine if it could be a glEnumError or whatnot. I have a sneaking suspicion it has to do with the video code (I_video.cc) not setting the swap interval right, but at least with all 3 of those logs I can see what might be happening.

Corbachu commented 6 years ago

@dsdman also run the engine with -norenderbuffers just in case!

dsdman commented 6 years ago

@Corbachu Okay, here it is:

Debug.txt: glsl.log: EDGE2.log

glext.log and edgegl.txt are both empty, but below is a list of all the ifdefs I could find using grep. Which one enables output to this file? ifdefs.txt

OrdinaryMagician commented 6 years ago

Also have the same black screen problem. Here's my logs:

debug.txt EDGE2.log glsl.log

Nothing from glext.log, and I have no edgegl.txt file.

Corbachu commented 6 years ago

@dsdman There should only be "GLSL.log" and "glext.log". Edgegl.txt turned into a logfile in the newer commits.

Try setting r_swapinterval to 0 in edge.cfg and reboot. Setting it to 2 is only for Win32, so if it is already at 1, then set it to 0.

Corbachu commented 6 years ago

Also, run the engine with -oldglchecks just in case.

dsdman commented 6 years ago

@Corbachu Ran it with r_swapinterval at both 0 and 1; still no graphics. However, setting it to 0 also seems to crash it (at 0 there are no sounds/input in-game, though the invisible menu still has sounds/input; at 1 there are sounds/input even in-game). Here are the updated logs (with it set to 0): edge2.log debug.txt glsl.log

Corbachu commented 6 years ago

Hmmm. I'm not sure what the issue can be at this point. You tried -oldglchecks right? Also try turning off r_bloom, r_fxaa, and r_lens.

dsdman commented 6 years ago

@Corbachu Yes, both -oldglchecks and -norenderbuffers are on. I just finished trying permutations of these two options along with changing r_swapinterval, r_bloom, r_fxaa, and r_lens in the config file. The only difference between any of these options seems to be that r_swapinterval at 0 crashes the game.

Corbachu commented 6 years ago

Hmmm, okay, so r_swapbuffers needs an "r" flag to prevent users from changing it (and breaking the game).

I started looking into OpenGL debuggers last night but haven't settled on one. Since Win32's renderer works, we would need to settle on one for Linux only.

I know something, somewhere, is raising a glEnumError; it isn't SDL, as the context appears to be created and dumped just fine. So I propose we start looking for a solution as soon as possible. What do you guys think?

Corbachu commented 6 years ago

@dsdman Do a pull. I have enforced much stricter checks for the RenderBuffers context. Now, if -norenderbuffers is specified, it will prevent RGL_InitRenderBuffers() from even creating the context. Avoid using -oldglchecks unless absolutely necessary. If this doesn't fix the Linux rendering context, I'm moving on to hard OpenGL debugging.
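
Roughly, the stricter check amounts to an early-out guard (only RGL_InitRenderBuffers and the -norenderbuffers switch come from the thread; M_CheckParm and the message below are assumptions about EDGE's argument/logging API):

```cpp
// Sketch: honor -norenderbuffers before any framebuffer/context work
// happens, instead of tearing things down after the fact.
void RGL_InitRenderBuffers(void)
{
    // M_CheckParm is the usual Doom-port way to test a command-line
    // switch; treat its use here as an assumption about EDGE's API.
    if (M_CheckParm("-norenderbuffers"))
    {
        I_Printf("RenderBuffers: disabled on the command line.\n");
        return;   // never create the offscreen framebuffer context
    }

    // ... existing framebuffer / renderbuffer setup ...
}
```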

dsdman commented 6 years ago

@Corbachu Just did a pull + recompile. There is a compile error (oddly enough when linking the executable):

CMakeFiles/3DGE.dir/src/r_main.cc.o: In function `RGL_Init()':
r_main.cc:(.text+0x1cc6): undefined reference to `r_fxaa_quality'

I "fixed" this by commenting out line 760 in r_main.cc: r_fxaa_quality.d = 0

Still gives me a black screen (both with and without -norenderbuffers). I guess we'll have to debug it through OpenGL debugging tools, though I recall it not being broken several months ago, so it may be a regression from one of the past changes to the renderer. I'll do some testing to see if I can find the most recent commit that doesn't have this problem.

Corbachu commented 6 years ago

@dsdman keep me up to date; if/when you find one, let me know how to set it up, since I have Ubuntu in a VM now :-)

Hope we can get this ironed out!

Corbachu commented 6 years ago

And r_fxaa_quality was introduced, but I forgot to push it... I'll do that tonight :D
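
For anyone hitting the same linker error: the reference was already in r_main.cc, but the definition lives in code that hadn't been pushed yet. A minimal sketch of the declaration/definition split (the cvar type name is an assumption; the .d member matches how the thread uses it):

```cpp
// In a header: declaration only -- no storage is allocated here, so
// the linker still needs a definition somewhere.
extern cvar_c r_fxaa_quality;

// In exactly one .cc file: the definition the link step was missing.
cvar_c r_fxaa_quality;
```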

gameblabla commented 6 years ago

So after applying my pull request (https://github.com/3dfxdev/hyper3DGE/pull/58) and disabling r_fxaa_quality, 3DGE works again on Linux. I first tried it with -norenderbuffers and r_lerp disabled, and the game would stutter. I then tried it again without those and with all the options enabled (including r_lerp at 1) and the game was pretty smooth! Still a very slight stutter.

However, here's where things get odd. (I'm using the Nouveau driver, btw.) I set my GPU to max performance without -norenderbuffers, and now the game would stutter like mad! I then set my GPU to its lowest clock and the game ran pretty smoothly again.

What is going on? lol. Seems like your code is still not quite right. 3DGE used to work OK before interpolation was added to the codebase, and it has behaved like this ever since.

madame-rachelle commented 6 years ago

@gameblabla can you please test master to make sure that the fixes I put work? I rewrote the swapinterval code to be somewhat OS-agnostic and... well, personally, easier to understand.

@Corbachu I don't know if I got the new r_swapinterval values or triggers right; to be quite honest, I don't even know if it should trigger based on r_vsync values or not. My "fix" is more about getting things working right now than getting things correct. You'll have to correct them if they're wrong. Sorry. >.<
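
For the record, SDL2 already gives us an OS-agnostic way to do this. A minimal sketch, assuming the cvar-to-SDL mapping below (which may not match what the commit actually does):

```cpp
#include <SDL.h>

// Apply the vsync setting through SDL so the same code path works on
// Windows and Linux. Requires an active SDL OpenGL context.
void ApplySwapInterval(int r_vsync)
{
    if (r_vsync <= 0)
    {
        SDL_GL_SetSwapInterval(0);   // vsync off
        return;
    }

    // Try adaptive vsync first (-1); not every driver supports it, in
    // which case SDL returns -1 and we fall back to standard vsync.
    if (SDL_GL_SetSwapInterval(-1) != 0)
        SDL_GL_SetSwapInterval(1);
}
```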

Corbachu commented 6 years ago

@dsdman could you also please test and make sure @raa-eruanna 's fixes have indeed restored the Linux renderer? Thanks :)

dsdman commented 6 years ago

@Corbachu Still won't render on my end. Another interesting thing is that if I revert to commit 19a7657 and apply @gameblabla's (now closed) Linux fix pull request, it still won't render on my end. So maybe this is an issue with the proprietary Linux Nvidia driver. I will see if I can use the Nouveau (open source) driver and try again with that.

gameblabla commented 6 years ago

@raa-eruanna Sadly, it does not fix it for me. I compiled the latest git and it still happens. It's hard to describe, but it is basically not in sync with the game world, like I can see previous frames. However, it's a little better now with r_lerp disabled. It still does not feel quite right, like vsync is not turned on (even though vsync is set to 1).

EDIT: I compared it with an older version of EDGE and it certainly does not feel as smooth as the older version.

madame-rachelle commented 6 years ago

Manipulating the swap interval values goes a bit beyond my OpenGL understanding then, unfortunately. If r_vsync is 0 it's supposed to turn it off completely and there's a statement earlier in the same file that does exactly that - I haven't checked if it's ever even executed, though.

gameblabla commented 6 years ago

I just switched to the proprietary Nvidia drivers and it works properly with them too. In fact, this does not have the sync issues Nouveau has (when using interpolation mode). I thought it was maybe due to the lack of automatic reclocking, but even when set to performance mode, it works fine. Kinda sad, because I didn't want to use the proprietary drivers. Now I need to try this on Intel chips.

EDIT: So I tried it on my laptop, and while it does not exhibit the issues I had on Nouveau, it is pretty slow, with or without interpolation. I also tried 3DGE on Weston (a Wayland compositor) with Nouveau and it's actually smoother than on X11; it plays with little stutter.

EDIT2: I also just realized the stuttering issues also happen with the old version... Well, looks like it's not going to be easy to fix :/

dsdman commented 6 years ago

I just finished testing after resolving #57. It runs butter smooth on my end, even with r_lerp set to 1.

Corbachu commented 6 years ago

Great! Performance issues are eliminated then. I'll go ahead and close it =)

gameblabla commented 6 years ago

I'm sorry to say this but they're not. It's still very slow on Intel machines with their iGPUs. It's definitely not smooth at all on them.

Corbachu commented 6 years ago

Noticed this issue was reopened. Some of you are saying it's smooth, others are not. No more reopening this issue -- it stays just for reference. For continuity and cleanliness' sake, can we please open a brand new issue? @dsdman sorry my friend! The thread is just too 'noisy' for my ADHD to follow ;-)