FelixRichter2000 opened this issue 3 weeks ago
This isn't a fair test because you're releasing the bodies based on step, which is a variable callback. Timing fluctuations can differ massively between steps (15.6 ms to 18 ms in the test I just ran), so no two runs will ever be the same. If you want to test determinism, the bodies need to be created in perfect sync, every single time.
I looked through the testbed code and understand that by default it runs at 60Hz, which can be changed using testbed.hz = 42;. Therefore, the step size should be stable by default and result in deterministic simulations.
I also assume that it does not matter at what time I add the new dynamic body between two world.step calls, as long as the step size remains constant between steps.
Please correct me if my assumptions are incorrect.
Additionally, I do not fully understand what you mean by "bodies need to be created in perfect sync, every single time." Could you please demonstrate how to achieve this in my sample code?
Thank you for your assistance.
The step size isn't stable in the testbed. You can confirm this for yourself by adding a variable that stores performance.now(), then updating it within step and logging the difference. If it were stable, you'd get exactly the same difference in every log. When I tried it earlier, it fluctuated significantly (±4 ms), which means the bodies were not being released at the same time per run.
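To make that measurement concrete, here's a minimal sketch of the timing check. recordTick is a hypothetical helper you would call from inside your step callback yourself; it is not a planck.js or testbed API:

```javascript
// Record performance.now() on every step callback, then compute the
// per-frame deltas. If the step timing were stable, all deltas would
// be (nearly) identical; in practice they jitter by several ms.
const ticks = [];

function recordTick(now = performance.now()) {
  ticks.push(now);
}

// Turn recorded timestamps into per-frame intervals (ms).
function frameDeltas(timestamps) {
  const deltas = [];
  for (let i = 1; i < timestamps.length; i++) {
    deltas.push(timestamps[i] - timestamps[i - 1]);
  }
  return deltas;
}
```

Call recordTick() at the top of your step function and log frameDeltas(ticks) after a few seconds to see the spread.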
If you run it locally (no rendering, no testbed, purely from the console), where stepping is perfectly regular, you can check the body positions to see whether the simulation is truly deterministic.
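A headless check along those lines could look like this sketch. runOnce is a hypothetical function you supply that rebuilds the scene from scratch, steps the world a fixed number of times with a fixed dt, and returns a serializable snapshot of body positions (e.g. an array of [x, y] pairs):

```javascript
// Run the same scene-from-scratch simulation several times and compare
// the final body-position snapshots. Any mismatch means the run is
// not deterministic.
function isDeterministic(runOnce, runs = 5) {
  const first = JSON.stringify(runOnce());
  for (let i = 1; i < runs; i++) {
    if (JSON.stringify(runOnce()) !== first) return false;
  }
  return true;
}
```

Because there is no rendering loop involved, every run takes the exact same sequence of steps, so this isolates the engine itself from frame-timing effects.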
From my observation, testbed.step gets called every frame, but the simulation steps are stable (by default the step size is 1/60). You can change the frequency with testbed.hz = 10;, which makes the simulation steps much bigger, but testbed.step will still get called every frame regardless.

So I basically don't care when testbed.step gets called. Since it gets called once per frame, it can spawn multiple bodies before the next simulation step, but the simulation should still be deterministic.
I came across that issue in my project (without testbed) where I always use a fixed step size of 1/60 and I am getting exactly the same non-deterministic results as with the testbed.
The simulation steps are stable, but the number of steps per frame is not: https://github.com/piqnt/planck.js/blob/master/testbed/StageTestbed.ts#L489-L494
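Simplified, the loop behind that link follows the standard fixed-timestep accumulator pattern. This is an illustrative reconstruction, not the actual testbed code; stepWorld stands in for world.step(timeStep):

```javascript
// Fixed-timestep accumulator: real elapsed time is banked, and the
// world is stepped 0..N times per rendered frame. Each individual
// step uses the same dt, but the *number* of steps per frame varies
// with frame timing.
function makeFixedStepper(timeStep, stepWorld) {
  let accumulator = 0; // seconds of unsimulated time
  return function onFrame(elapsedMs) {
    accumulator += elapsedMs / 1000;
    let steps = 0;
    while (accumulator >= timeStep) {
      stepWorld(timeStep);
      accumulator -= timeStep;
      steps++;
    }
    return steps; // how many simulation steps this frame triggered
  };
}
```

For example, with timeStep = 1/60, a fast 10 ms frame may trigger zero steps while a slow 40 ms frame triggers two, which is exactly why anything keyed to the frame callback drifts relative to the simulation.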
The test releases a body per frame step, not per simulation step. So sometimes the world may have advanced by several steps before the body is released, sometimes none, and sometimes it may have stepped only once.
Therefore, the body release rate is not consistent. Try releasing them against the world step, instead of the frame step. Something like:
```js
// Release bodies on the simulation step, not the frame step.
let cnt = 0;
world.on('pre-step', () => {
  if (cnt++ < 100) {
    createPlayer();
  }
});
```
It's a very hard test to gauge visually, but with this change I'm at least seeing much more consistent results across multiple runs. Previously you'd get a varying number of balls flying off the right of the shape, but with pre-step it's always exactly 4.
Thanks, the pre-step function is really useful for these tests. Now we are getting somewhere.
I did some more test runs with this slightly modified testbed that uses the pre-step function. DEMO.
I also recorded what I am seeing, since I know on other hardware it might look different.
https://github.com/piqnt/planck.js/assets/35802356/557323fb-cf02-423e-8ae6-30678d42beaa
Here is what I observed:
In my use case I need a physics engine that can simulate these balls (which don't collide with each other) fully independently and deterministically.
The only solution I currently see for going forward with this library is to use a separate world for each ball, and I don't like that solution.
I attempted using individual worlds for each player and eventually achieved deterministic results. However, I noticed inconsistencies even when simulating only one player per world multiple times.
Steps:
Repeating this process produced different results between the first run and subsequent runs. I believe this is due to collision detection optimization (caching).
Solution:
To ensure consistent results, create a new world for each simulation of each player.
Summary:
Using a fresh world for each simulation ensures deterministic behavior.
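As a toy illustration of why a fresh world helps: this is NOT planck.js code, just a stand-in whose solver keeps a warm-start cache the way a real engine caches contact impulses across steps. Reusing the same instance makes run 1 differ from run 2; a fresh instance per run is deterministic:

```javascript
// FakeWorld mimics a physics world with internal solver state that
// persists across runs (analogous to cached contact impulses).
class FakeWorld {
  constructor() {
    this.cache = 0; // "warm-start" cache, survives between run() calls
  }
  run(steps) {
    let x = 0;
    for (let i = 0; i < steps; i++) {
      x += 1 + this.cache * 0.001; // cache nudges the result
      this.cache = x;              // and is updated as a side effect
    }
    return x;
  }
}

// Reused world: leftover cache makes the second run diverge.
const reused = new FakeWorld();
const run1 = reused.run(10);
const run2 = reused.run(10);

// Fresh world per run: identical results every time.
const freshA = new FakeWorld().run(10);
const freshB = new FakeWorld().run(10);
```

The same reasoning applies to the real engine: creating a new world per simulation guarantees all internal caches start from the same blank state.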
Thanks for investigating and reporting in such detail. I do not recommend reusing a world object in general; however, I wonder whether the root cause has other impacts. I will keep this open until we find more time to investigate.
While using Planck.js, we observed non-deterministic behavior in a scenario involving multiple dynamic bodies. Despite Planck.js claiming to be deterministic, our sample script demonstrates inconsistent results in repeated runs.
Steps to Reproduce:
Demo Link: You can view a live demonstration of this issue at the following link: Demo Website
Expected Behavior: The simulation should produce consistent results in each run, with the dynamic bodies following the same trajectories and collisions occurring at the same points.
Actual Behavior: The simulation exhibits varying behaviors in different runs, with dynamic bodies following different paths and collisions occurring at different times, indicating non-deterministic behavior.
Additional Context: The observed non-deterministic behavior raises concerns about the reliability of Planck.js for simulations requiring precise and repeatable outcomes. We request further investigation into this issue to identify and address the underlying cause.