Add option for frame prediction (= less jitter at high frame rates) #2962

Closed mstoeckl closed 3 weeks ago

mstoeckl commented 1 month ago

This PR adds an option which makes SuperTux render more smoothly on high frame rate displays.

(The option only updates the camera transform and Tux's displayed position at the display frame rate; everything else continues to update at 66.66 Hz. If you look carefully at some moving enemies/objects, you will still be able to see jitter (definition) and/or blur. With quite a bit of work, one could update all object draw() calls to project object positions forward in time, which would mostly resolve object jittering at the cost of introducing much less noticeable stutter (definition). If there is interest, I could work on implementing extrapolation (definition) for other objects, but at the moment this PR works well enough for me. I can only perceive a difference with this option on when setting my monitor to e.g. 120 or 240 Hz; at 60 Hz I don't notice a difference, but that could just be me.)
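
As a rough illustration of what "projecting a position forward in time" means here, a minimal sketch of forward prediction for a single drawn coordinate (a generic example; the function name and signature are not taken from this PR):

// Illustrative sketch only: predict where to draw a coordinate between
// fixed-rate logic steps. 'fraction' is how far the current render frame
// lies into the current logic step, in [0, 1).
float predict_draw_coord(float prev_step_pos, float curr_step_pos, float fraction)
{
  // Project the most recent per-step motion forward by 'fraction' of a step,
  // instead of snapping the drawn position to the last completed step.
  return curr_step_pos + (curr_step_pos - prev_step_pos) * fraction;
}

Interpolating between the last two steps instead (prev + (curr - prev) * fraction) would avoid prediction errors, at the cost of roughly one logic step of extra latency.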

Naming is hard, and suggestions for better names for this feature are welcome. "Camera smoothing"? "Render above logic"? "Frame prediction"?

Related: https://github.com/SuperTux/supertux/issues/909, (maybe) https://github.com/SuperTux/supertux/issues/910

mstoeckl commented 1 month ago

Please do not merge this yet: I just noticed it causes flickering issues in the level editor. (And there's a compile warning I'll fix soon.)

bruhmoent commented 1 month ago

I can confirm.

weluvgoatz commented 1 month ago

I can confirm.

Then why did you approve it...

bruhmoent commented 1 month ago

Since I have a 144 Hz monitor, I wanted to confirm that the feature works wonders, i.e. to confirm the main functionality.

mstoeckl commented 1 month ago

Please do not merge this yet: I just noticed it causes flickering issues in the level editor. (And there's a compile warning I'll fix soon.)

I've updated this PR to try to fix both of these; I also refactored to change the feature name and avoid introducing a new global variable.

mstoeckl commented 1 month ago

I just don't know if my setup is good enough for me to notice any difference.

If you don't have a high frame rate monitor, one way to make the change more visible is to artificially degrade the game logic rate: in src/supertux/constants.hpp, set LOGICAL_FPS = 20.0f. Then find a flat level and run straight for a while; when the camera moves, the fixed level tiles should appear to move in a much less jerky fashion with this option enabled. On a 60 Hz monitor, with SuperTux fullscreen, I can see the difference even on the title screen.
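
For reference, the kind of change meant (the exact declaration in src/supertux/constants.hpp may differ; only the value matters for this test):

// Lower the logic rate from the usual ~66.66 steps per second to 20,
// so the smoothing effect is visible even on a 60 Hz monitor.
static const float LOGICAL_FPS = 20.0f;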

mstoeckl commented 1 month ago

I'm sorry, but I'm occasionally seeing huge skipping when playing with the change enabled (the screen will jump back and forth violently). I'll try to capture it on video.

Does this happen on any particular level?

mstoeckl commented 1 month ago

I'm sorry, but I'm occasionally seeing huge skipping when playing with the change enabled (the screen will jump back and forth violently). I'll try to capture it on video.

I've updated this PR to fix an issue where the rendered frame could skip by up to 8 game steps' worth of change when the game loop is falling behind the render loop / there is lag. (In ScreenManager::loop_iter, I wasn't taking into account the fact that elapsed_ticks could be larger than ms_per_step.) This probably fixes the issue that you are referring to, but there could of course be multiple bugs.
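
Schematically, the shape of that fix (variable names follow the description above, not the actual diff):

// Sketch only: work out how far into the current logic step the render
// frame is, and clamp it so a lagging game loop cannot push the predicted
// frame several steps ahead of the simulation.
float fraction = static_cast<float>(elapsed_ticks) / ms_per_step;
if (fraction > 1.0f)
  fraction = 1.0f;  // never predict more than one full step ahead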

tobbi commented 1 month ago

Thanks for working on this further. I tried reproducing the issue but was rather unsuccessful. I'll give this another look soon (I'm currently working on a few cppcheck fixes for this game).

mstoeckl commented 1 month ago

I tried reproducing the issue but was rather unsuccessful.

It's only luck that I noticed it -- I happened to be using a computer where I think the GPU power saving is misbehaving, which causes noticeable lag for short periods. The issue I noticed and fixed could be reliably triggered by injecting the following code into loop_iter:

{
  // Inject artificial lag: roughly once every 400 iterations, start a
  // burst of 20 consecutive slow frames.
  static int lag_wave = 0;
  if (rand() % 400 == 0) {
    lag_wave = 20;
  }
  if (lag_wave > 0) {
    // Stall each frame of the burst by 50 ms.
    SDL_Delay(50);
    lag_wave--;
  }
}

tobbi commented 1 month ago

I tried reproducing the issue but was rather unsuccessful.

It's only luck that I noticed it -- I happened to use a computer where I think the GPU powersaving is messed up, which causes noticeable lag for a short period. The issue I noticed and fixed could be reliably triggered by injecting the following code in loop_iter.

{
  static int lag_wave = 0;
  if (rand() % 400 == 0) {
     lag_wave = 20;
  }
  if (lag_wave > 0) {
     SDL_Delay(50);
     lag_wave--;
  }
}

I tried this yesterday, and it seems to have fixed it.

HybridDog commented 4 weeks ago

I have tested it on a 60 Hz display and the movement looks much smoother to me now. For a detailed look at how it jitters, I recorded SuperTux with simplescreenrecorder and used ffmpeg and a Python 3 script to offset each frame in the opposite direction of the motion.

Video with the setting disabled: https://github.com/SuperTux/supertux/assets/3192173/b5077e9d-c284-43ff-abd5-6c74a5daede6
Video with the setting enabled: https://github.com/SuperTux/supertux/assets/3192173/ae411fdb-e7bc-49fe-ad21-fef4bd7d33dc
The script: visualize_jitter.py.txt

I encoded the videos at 1/3 speed at 20 Hz for easier visualisation. I recommend watching them with mpv; Firefox apparently doesn't support my 20 Hz mp4 videos. The videos show that with the setting disabled, SuperTux jitters rhythmically, while with the setting enabled, the jittering is weak (small spatial distance) and random.

If I understand the code correctly, the interpolation moves the camera using the elapsed time. The elapsed time can be noisy, and it is not the same as the time when the display shows the frame, i.e. it is not aligned to, for example, 60 Hz. As far as I know, ideally, with a fixed refresh rate display, the camera movement should be calculated using the time at which the display will show the image to the user. I don't know if it's possible to do this without stuttering; this webpage about swapchains and frame pacing contains some information about the topic.

On a sample-and-hold display there is always motion blur, so in this case I think weak jittering just makes this blur unnoticeably bigger, whereas strong jittering, which this PR fixes, is much more noticeable. On the other hand, with a blinking display, such as a CRT, an OLED with black frame insertion, or an LCD with backlight strobing, there is (almost) no motion blur by default, and weak jittering may be easier to notice.

mstoeckl commented 4 weeks ago

If I understand the code correctly, the interpolation moves the camera using the elapsed time. The elapsed time can be noisy and it is not the same as the time when the display shows the frame, i.e. it is not aligned to, for example, 60 Hz. As far as I know, ideally, with a fixed refresh rate display, the camera movement should be calculated using the time when the display will show the image to the user.

Yes. I've been experimenting with assuming, when vsync is on, that the time increases by 1/m_fps_statistics.get_fps() instead of by the measured wall-clock time; while this fixes the elapsed-time noise, it introduces its own issues on startup, when the frame rate changes, or when there are lag spikes. I'd prefer to leave that for the future, since I still don't know of a good solution and think this PR is already an improvement at high frame rates (where the elapsed-time noise is often smaller).
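
Roughly, the experiment described above looks like this (a sketch, not committed code; vsync_enabled is an illustrative flag):

// When vsync is on, assume each rendered frame advances time by exactly
// one refresh interval instead of trusting the noisy measured delta.
float measured_dt = static_cast<float>(elapsed_ticks) / 1000.0f;  // seconds
float assumed_dt = 1.0f / m_fps_statistics.get_fps();
float dt = vsync_enabled ? assumed_dt : measured_dt;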

(Also: thank you for posting the script, I've been doing a lot of testing by eye.)

HybridDog commented 4 weeks ago

I'd prefer to leave that for the future, since I still don't know of a good solution and think this PR is already an improvement at high frame rates (where the elapsed time noise is often smaller.)

I agree, and I also don't know a good solution. The gamma-incorrect bilinear interpolation of texture pixels may also very slightly increase motion blur, since the positioning of the texels affects where the image is sharp and where it is not.

In my opinion, the PR is ready to be merged.

tobbi commented 3 weeks ago

Should this be on by default?

mstoeckl commented 3 weeks ago

Should this be on by default?

I don't have a strong opinion on this -- I chose off by default because there are probably many very minor issues noticeable with this option that haven't been discovered yet (and some that have, like the timing noise problem). I usually try to keep the git tree in an always-releasable state, but on the other hand, doing so isn't as important for a game. Turning the option on by default would make any associated issues more important for 0.7.0.

If I understand the code correctly, the interpolation moves the camera using the elapsed time. The elapsed time can be noisy and it is not the same as the time when the display shows the frame, i.e. it is not aligned to, for example, 60 Hz. As far as I know, ideally, with a fixed refresh rate display, the camera movement should be calculated using the time when the display will show the image to the user.

I'd prefer to leave that for the future, since I still don't know of a good solution and think this PR is already an improvement at high frame rates (where the elapsed time noise is often smaller.)

I agree and I also don't know a good solution.

After reading more about this issue, I think the most correct thing to do would be to get precise predicted frame display times from the operating system. However, a) SDL doesn't yet support this, and b) there are complications with variable refresh rate monitors, uncertainty, precise definitions, and synchronization that have held up proposals to support this on Linux, like Vulkan's VK_EXT_present_timing, for a few years. Other than hypothetical future presentation timing, I still don't know what the best option is, but I will continue to think about it. (Edit: there are a few general improvements to loop_iter that I have in mind, like switching to higher-precision time values and improving the FPS measurement, but I'd like to defer those to future PRs.)

HybridDog commented 3 weeks ago

I don't care if the setting is on or off by default.

Other than hypothetical future presentation timing, I still don't know what the best option is, but will continue to think about it.

I assume that the jittering is caused by noise, so perhaps it is possible to denoise the time measurements to reduce jittering. However, I don't know a good algorithm for this purpose. With a simple low-pass filter (e.g. an exponential average), not only the noise but also sudden large jumps in the delta time would be smoothed, and the noise would be reduced but not eliminated. Denoising should also have the property that the overall sum of the denoised delta times does not differ from the overall sum of the measured delta times. Something like a bilateral filter could preserve large delta-time jumps, but I don't know whether it preserves the overall sum of delta times.
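
For concreteness, a toy sketch of that idea: an exponential average over the delta times, plus a crude correction term so the sum of the smoothed deltas tracks the sum of the measured ones. The class name and both constants are made up, and this only illustrates the sum-preservation property, not a vetted filter design:

// Toy example, not proposed code: smooth per-frame delta times while
// gradually paying back any accumulated difference between the measured
// and the returned totals.
class DeltaTimeSmoother
{
public:
  float filter(float measured_dt)
  {
    const float ALPHA = 0.1f;     // smoothing strength (arbitrary)
    const float CATCH_UP = 0.1f;  // drift pay-back rate per frame (arbitrary)

    if (!m_initialized) {
      m_avg = measured_dt;
      m_initialized = true;
    }
    m_avg += ALPHA * (measured_dt - m_avg);  // exponential average of deltas

    // m_drift = (sum of measured deltas) - (sum of returned deltas) so far;
    // returning a little more (or less) than the average slowly cancels it.
    float smoothed_dt = m_avg + CATCH_UP * m_drift;
    m_drift += measured_dt - smoothed_dt;
    return smoothed_dt;
  }

private:
  bool m_initialized = false;
  float m_avg = 0.0f;
  float m_drift = 0.0f;
};

Note that a filter like this still smooths genuine large jumps in the delta time, which is exactly the drawback described above; it only addresses the sum-preservation requirement.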