Okay, we are in uncapped mode and we minimize the game window. CPU usage increases greatly, so let's see what is happening...
The variable `screenvisible` is deactivated here:
https://github.com/JNechaevsky/russian-doom/blob/master/src/i_video.c#L345
```c
case SDL_WINDOWEVENT_MINIMIZED:
    screenvisible = false;
    break;
```
...which then leads here: https://github.com/JNechaevsky/russian-doom/blob/master/src/doom/d_main.c#L547
```c
// Update display, next frame, with current state.
if (screenvisible)
    D_Display ();
```
Apparently, having no `D_Display` call in the `while` loop makes the loop run as fast as possible, and thus CPU usage increases greatly. The simplest approach is to not use `screenvisible` at all, which fixes this bug, but perhaps there is a smarter way to go.
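For reference, a simplified sketch of the loop in question (based on `D_DoomLoop()` in the Chocolate-derived sources; details may differ between ports):

```c
void D_DoomLoop (void)
{
    // ... initialization omitted ...

    while (1)
    {
        // frame-synchronous I/O operations
        I_StartFrame ();

        // in uncapped mode with no tics due, this returns almost immediately
        TryRunTics ();

        // move positional sounds
        S_UpdateSounds (players[consoleplayer].mo);

        // Update display, next frame, with current state.
        // If this is skipped and nothing else blocks, the loop spins
        // as fast as the CPU allows.
        if (screenvisible)
            D_Display ();
    }
}
```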
Brad, in Doom Retro the problem seems to be caused by something else; I can't say for sure what exactly is causing it there. :( Maybe you should add some handling for the `SDL_WINDOWEVENT_MINIMIZED` state?
Does this problem occur with uncapped framerate with either vsync on or off?
The problem seems to be that `D_Display()` is the function which calls `I_FinishUpdate()`, which in turn is the function which calls `SDL_RenderPresent()`. Only the latter performs vsync, i.e. it is the one that waits for the next screen refresh.

Your approach keeps the CPU reasonably busy even when the window is minimized, by letting it still render a scene which is, however, never presented to the user.
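To illustrate the call chain (a heavily simplified sketch; the real `I_FinishUpdate()` in i_video.c does considerably more work):

```c
void I_FinishUpdate (void)
{
    // ... copy the software framebuffer into the streaming texture ...

    SDL_RenderClear(renderer);
    SDL_RenderCopy(renderer, texture, NULL, NULL);

    // With SDL_RENDERER_PRESENTVSYNC set, this call waits for the next
    // vertical refresh; without it, it returns immediately.
    SDL_RenderPresent(renderer);
}
```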
I have two alternative ideas for the `(!screenvisible)` case.

Either we explicitly give the CPU some free time by putting it to sleep for 1 ms:
```diff
--- a/src/doom/d_main.c
+++ b/src/doom/d_main.c
@@ -558,6 +558,8 @@ void D_DoomLoop (void)
         // Update display, next frame, with current state.
         if (screenvisible)
             D_Display ();
+        else
+            I_Sleep(1);
     }
 }
```
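For reference, `I_Sleep()` in the Chocolate-derived ports is, as far as I remember, just a thin wrapper around `SDL_Delay()`, which yields the CPU to the operating system for the given number of milliseconds:

```c
#include "SDL.h"

// i_timer.c (sketch): sleep for 'ms' milliseconds, letting the OS
// schedule other work in the meantime.
void I_Sleep (int ms)
{
    SDL_Delay(ms);
}
```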
Or we just fall back to capped rendering at 35fps in that case, which means putting the CPU to sleep until the next 1/35 sec cycle has passed:
```diff
--- a/src/d_loop.c
+++ b/src/d_loop.c
@@ -728,7 +728,7 @@ void TryRunTics (void)
     // [AM] If we've uncapped the framerate and there are no tics
     // to run, return early instead of waiting around.
-    if (counts == 0 && crispy->uncapped && gametic)
+    if (counts == 0 && crispy->uncapped && gametic && screenvisible)
         return;
 
     if (counts < 1)
```
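With `screenvisible` added to the condition, the early return is skipped while the window is minimized, so `TryRunTics()` falls through into its normal wait loop and sleeps until the next 1/35 s tic is due. Roughly (simplified from d_loop.c; the actual loop also handles netgame stalls and the menu):

```c
    // wait for new tics if needed
    while (lowtic < gametic / ticdup + counts)
    {
        NetUpdate ();
        lowtic = GetLowTic ();

        // ... netgame stall / menu timeout handling omitted ...

        // Don't hog the CPU while waiting.
        I_Sleep (1);
    }
```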
I think I like the second approach better.
You are tearing me apart with bare hands by asking me to enable vsync. :anguished:
Fortunately, no: it happens with vsync both enabled and disabled. I have tried both approaches and both of them work fine, but the second one apparently is better: this way CPU usage more often stays at 00-01%. Sorry for such a rough description, but I don't have a tool for proper CPU usage profiling; all I can do is observe the usage in Windows Task Manager.
Thanks for the clarifications! Let's use the second approach then. :blush:
I still don't quite get it. Your screen shows 60 pictures per second. Yet you cannot stand the fact that the game content is updated 60 times per second; you insist that it is updated twice as often, although you will never actually see the extra calculated content. Eh?!
Wrong tracker. :stuck_out_tongue:
But there is a difference! I don't know how to explain it, it's barely noticeable, but it exists. It feels like... frames change slightly more smoothly at 120 than at 60.
RD may not be a good example for noticing it, so I highly recommend you to look for this difference in GZDoom; it is friendly with any widescreen resolution. Just disable the stinking vsync, then run GZDoom. Initially the fps cap is 200 fps, but you can lower it to 60 with the console command `vid_maxfps`. Then take a very, VERY close look, and you'll see the difference.
Another notable thing: at 60 fps with vsync disabled, I can see some barely noticeable "arcs" while rotating the camera, but this does not happen at 120. Enabling vsync seems to fix this, though.
This can't all be the fruit of my inflamed imagination. At least I hope so! :sweat:
I'm starting to understand that I'm wrong in some aspects... I trust my eyes, but that is not an argument. One thing that really needs to be done is to implement `vid_vsync` as a config variable. It should be something like this:
```c
if (vid_vsync)
    renderer_flags |= SDL_RENDERER_PRESENTVSYNC;
else
    renderer_flags &= ~SDL_RENDERER_PRESENTVSYNC;
```
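A hypothetical sketch of how that could be wired up when creating the renderer; the name `vid_vsync` follows the snippet above, while the function name and config binding here are only assumptions, not the port's actual code:

```c
#include "SDL.h"

static SDL_Renderer *renderer;

// Saved config variable; hypothetically registered in the config system,
// e.g. something like M_BindIntVariable("vid_vsync", &vid_vsync).
static int vid_vsync = 1;

// Hypothetical helper: pick the renderer flags according to vid_vsync
// and (re)create the renderer.
static void CreateRendererForWindow (SDL_Window *screen)
{
    Uint32 renderer_flags = SDL_RENDERER_TARGETTEXTURE;

    if (vid_vsync)
        renderer_flags |= SDL_RENDERER_PRESENTVSYNC;
    else
        renderer_flags &= ~SDL_RENDERER_PRESENTVSYNC;

    if (renderer != NULL)
        SDL_DestroyRenderer(renderer);

    renderer = SDL_CreateRenderer(screen, -1, renderer_flags);
}
```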
It seems to be caused by the uncapped framerate and only on Windows, but it is not happening in Heretic and Hexen. I think I know why... Give me a couple of minutes to think.
Kindly inviting @fabiangreffrath and @bradharding, since it is also happening in both Crispy and Retro.