Open Khhs167 opened 1 year ago
Tagging subscribers to this area: @mangod9. See info in area-owners.md if you want to be subscribed.
| Author: | Khhs167 |
|---|---|
| Assignees: | - |
| Labels: | `area-System.Threading`, `untriaged` |
| Milestone: | - |
There are many reasons why the time between frames can spike beyond normal, and they aren't limited to the debugger: other expensive programs running in the background, unexpected latency/lag, etc.
A fairly standard way to solve this more generally is to provide a cap for your maximum delta time. Unity provides this via https://docs.unity3d.com/ScriptReference/Time-maximumDeltaTime.html. XNA provided a similar feature, as does the mini engine used by the official DirectX samples repo: https://github.com/microsoft/DirectX-Graphics-Samples/blob/master/Samples/Desktop/D3D12Multithreading/src/StepTimer.h#L92-L96 if you want an MIT-licensed reference.
Providing such functionality can help handle any kind of larger-than-expected delta time and ultimately makes your app more robust.
The standard practice is simply to never let the timer go above a certain threshold. That's why demanding games running on an under-powered computer appear to move in slow motion, even in modern game engines. If your game requires high-precision or deterministic physics, the delta should be fixed at an exact framerate (often the refresh rate of the display). If variable deltas are acceptable, you should cap the delta (usually at or above the display's refresh interval).
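Capping the delta as described above takes only a few lines. This is a minimal sketch in C#, not code from any engine; the 0.1-second cap and the names are illustrative, chosen in the spirit of Unity's `Time.maximumDeltaTime` and the StepTimer linked above:

```csharp
using System;
using System.Diagnostics;

// Illustrative cap: any frame longer than this is treated as 0.1s,
// so a debugger pause or background-load spike can't produce one
// enormous simulation step.
const double MaxDeltaSeconds = 0.1;

var frameTimer = Stopwatch.StartNew();

// ... one iteration of a frame loop ...
double deltaSeconds = frameTimer.Elapsed.TotalSeconds;
frameTimer.Restart();

// Clamp before feeding the delta into physics/updates.
deltaSeconds = Math.Min(deltaSeconds, MaxDeltaSeconds);
```

With this clamp, a 30-second pause in the debugger shows up to the game as a single 0.1-second frame instead of 30 seconds of simulation to catch up on.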
Glitches related to this have always been around, such as rapidly pausing and unpausing in racing games to manipulate acceleration, or manipulating refresh rates during certain battles to land extra enemy hits.
Neither you nor .NET can timestamp when threads are context-switched: that's managed by the OS, and your code, by definition, can't execute with the precision needed to track those events. Anything you implement will need to be at the application level.
Hello!
I am currently working on a game in C#.
In this game I (like most games) need to measure the frame delta. The issue is that there's no way for me to do that reliably.
My method of measuring the frame delta is a stopwatch that is reset at the start of each frame; I then read the elapsed milliseconds to calculate how long the last frame took, which works well enough.
But whenever I hit a breakpoint, or pause the program, this stopwatch keeps on going, and going, and going, making the frame delta huge and causing lag, or in the worst-case scenario a crash due to the amount of work the game needs to catch up on.
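The measurement pattern described above looks roughly like this (a self-contained sketch; the names are illustrative, not from the actual game):

```csharp
using System;
using System.Diagnostics;

// Reset a Stopwatch each frame; the previous elapsed time is the delta.
var frameTimer = Stopwatch.StartNew();

for (int frame = 0; frame < 3; frame++)
{
    double deltaSeconds = frameTimer.Elapsed.TotalSeconds;
    frameTimer.Restart();

    // If the debugger pauses here, the Stopwatch keeps running in
    // wall-clock time, so the next deltaSeconds can be arbitrarily large.
    Console.WriteLine($"frame {frame}: delta = {deltaSeconds:F6}s");
}
```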
So, how could we solve this?
My idea is that we add a timer thread to .NET that keeps track of program-relative time. It pauses whenever the debugger pauses and resumes whenever the debugger resumes. This way, timers (and other program components) could sync to this timer instead, pausing them whenever the program itself pauses.
The main issue with this solution is that programs that use stopwatches or some other timer function to reliably measure real-world time could break, as they expect these timers not to pause while debugging.
The solution to that is, however, very simple: either a compiler flag, or a property on the stopwatches etc. (I'm thinking something like `RealTime`), or both. In the case of both, the compiler flag could set the default value of the aforementioned `RealTime` property.
Hope I put this in the right place!
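To make the proposal concrete, here is a sketch of what an application-level equivalent could look like today. Everything here is hypothetical: `GameClock`, `Pause`, and `Resume` are not existing .NET APIs, and since user code cannot observe a paused debugger, the pause/resume calls must come from the application itself (e.g. from a pause-menu handler):

```csharp
using System;
using System.Diagnostics;

// Hypothetical application-level "pausable" clock built on Stopwatch.
// Accumulates elapsed time only between Resume/Pause calls.
public sealed class GameClock
{
    private readonly Stopwatch _sw = new Stopwatch();
    private TimeSpan _accumulated = TimeSpan.Zero;

    public void Resume() => _sw.Restart();

    public void Pause()
    {
        _accumulated += _sw.Elapsed;
        _sw.Reset();
    }

    // Program-relative time: excludes any span spent paused.
    public TimeSpan Elapsed => _accumulated + _sw.Elapsed;
}
```

A built-in version with a `RealTime` switch, as proposed, would additionally need runtime/debugger cooperation to pause automatically at breakpoints, which is exactly what application code cannot do on its own.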