17cupsofcoffee / tetra

🎮 A simple 2D game framework written in Rust
MIT License

Minimizing the window jumps up the CPU usage. #213

Closed: sumibi-yakitori closed this issue 3 years ago

sumibi-yakitori commented 3 years ago

Summary: Minimizing the window jumps up the CPU usage.

Steps to reproduce: Run an empty application and minimize its window.

Additional info: https://github.com/17cupsofcoffee/tetra/blob/62c9a264c8bd35585515e4e43375d5f835f4598f/src/context.rs#L155 Perhaps this problem occurs because std::thread::yield_now() does not wait properly when the application is idle.

This issue is confirmed to occur on both Windows and macOS.

17cupsofcoffee commented 3 years ago

Hm, I can't replicate this on my machine - I see no noticeable difference in CPU usage on any of my cores when I minimize the animation example. I also have an AMD Ryzen 5 processor (although mine is a 6-core rather than a 4-core).

That's not to say that there's not an issue (the screenshots you've provided say otherwise!) but it's a tricky one to diagnose...

sumibi-yakitori commented 3 years ago

As a test, I cloned the master branch of tetra into a new empty folder on my Windows PC and ran cargo run --release --example animation with both the stable and nightly Rust toolchains, and the problem still occurred.

sumibi-yakitori commented 3 years ago

Since the above Windows PC and my Mac both have AMD Radeon GPUs, I also ran the animation example on a Surface Pro. This machine has an Intel HD Graphics 620.

Even in this case, minimizing the window seems to increase the CPU usage.

17cupsofcoffee commented 3 years ago

Still can't replicate locally, but someone on Discord was able to get similar behaviour in GGEZ with yield_now, so I think that may indeed be the culprit.

Does replacing that line with std::thread::sleep(std::time::Duration::from_millis(1)) fix the issue for you?

If so, I'll need to investigate the implications of making that change, but I'd be open to doing it, at least as a short-term fix. My main worry was that the resolution of the timing wouldn't be high enough (it defaults to a minimum of 15ms on Windows), but apparently that's overridden by SDL2 these days.

sumibi-yakitori commented 3 years ago

Applying your fix improves it somewhat:

2.4% -> 14% (on macOS)

Since I think the essence of the problem is that the number of times the tick method is called increases when the window is minimized, if you intend to introduce a temporary workaround, how about the following code? (At least when vsync is enabled.)

std::thread::sleep(
  Duration::from_secs_f32(1.0 / TARGET_FPS)
    .saturating_sub(tetra::time::get_delta_time(ctx)),
);

I'm not familiar with the structure of tetra, so there may be something wrong with this code, though.

17cupsofcoffee commented 3 years ago

The reason I suggested the sleep(1ms) is because it's simple and I'm fairly confident that it works 😅 Love2D does it by default, seemingly with no ill effects.

I'd be open to adding smarter frame limiting (IIRC MonoGame does something kinda similar to your suggested fix), I just need to make sure I understand how it impacts the various configurations of the game loop (e.g. is it going to make the fixed updates start acting weird).

> Even if SDL overrides the minimum update frequency, is the interval really 1 ms? The update interval may be insufficient in my code.

1ms is the minimum resolution for std::thread::sleep, at least on Windows. You have to switch to higher resolution timing APIs to be any more precise than that, which makes things a lot more complicated, and none of the game engines I've looked so far bother with it. I assume there's a good reason for that, but I don't know what it is 😆

sumibi-yakitori commented 3 years ago

Thanks for everything. For now, I've decided to inject the sleep code into the draw method of my application, so I may not need to have it urgently addressed until I find the best solution to this problem.

As for not being able to reproduce the problem in your environment: have you made any changes to your GPU's global settings that might affect the refresh rate (G-SYNC, etc.)? Even if so, I don't know whether that would affect behaviour while the window is minimized...

I would also like to test it on an NVIDIA GPU machine when I have time, but I don't have a high-end NVIDIA GPU.

17cupsofcoffee commented 3 years ago

Ah, I do in fact have an NVIDIA GPU, so that might be the difference. I don't think I've changed any settings that would affect this, but I'll dig into that when I get some time.

17cupsofcoffee commented 3 years ago

I've opted to use the sleep(1ms) approach for now, as that seems to improve things a lot without much risk of breakage. Once that change is released, I will create a new issue (or repurpose this one) for potentially adding smarter behaviour.

17cupsofcoffee commented 3 years ago

Released the easy fix in 0.5.6, and have opened #218 for any future discussion of game loop timing changes. Thank you for the bug report!