Prozi / detect-collisions

Detects collisions between bodies: Points, Lines, Boxes, Polygons (concave too), Ellipses and Circles. Also RayCasting. All bodies can have offset, rotation, scale and bounding box padding, and can be static (non-moving) or trigger bodies (non-colliding).
https://prozi.github.io/detect-collisions/
MIT License

feature-request: Extend Benchmarking #37

Open BobGneu opened 1 year ago

BobGneu commented 1 year ago

Does this project have specific goals around performance that are maintained or monitored over time?

Something like what Deno does, where performance would be tracked and maintained over time.

Recently I found this GitHub Actions-based tool that seems to be in the right vein, though it's tied to gh-pages.

I am definitely down to put together a PR for it if there is interest. I will need some input on what would be most beneficial to benchmark, though.

Prozi commented 1 year ago

hello, thanks for the suggestions

I will try to connect https://github.com/Prozi/detect-collisions#benchmark, with some changes, with what you linked

Prozi commented 1 year ago

I updated the benchmark to be more deterministic and used it inside the CircleCI build process: https://app.circleci.com/pipelines/github/Prozi/detect-collisions

example:

https://app.circleci.com/pipelines/github/Prozi/detect-collisions/156/workflows/61b5a6fd-9947-4443-9517-f4a28251cd05/jobs/130

[screenshot of the CircleCI job output]

Prozi commented 1 year ago

I tried reading about what you pasted but had no success after the first try; it seems quite complicated.

Have you tried using this tool, and could you provide some help if needed?

BobGneu commented 1 year ago

I have, and it only works well with GitHub Pages.

The trick is that the build agent is only there to (see the sketch after this list):

  1. Pull down the previous run
  2. Append the latest
  3. Truncate the results to the window of results you are interested in
  4. Commit & push the results back into the gh-pages branch
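
For reference, a minimal sketch of that four-step flow as a Node/TypeScript script run from CI; the file name, branch handling and result shape below are illustrative assumptions, not part of detect-collisions or of any specific action:

```ts
// Sketch: persist benchmark results on the gh-pages branch from CI.
// Assumes a Node >= 18 environment; file names, branch name and the
// shape of `latestResult` are illustrative.
import { execSync } from "node:child_process";
import { readFileSync, writeFileSync } from "node:fs";

const BRANCH = "gh-pages";
const FILE = "benchmark-data.json";
const WINDOW = 100; // keep only the last 100 runs

interface BenchmarkEntry {
  date: string;
  commit: string;
  opsPerSecond: number;
}

function run(cmd: string): string {
  return execSync(cmd, { encoding: "utf8" }).trim();
}

export function persist(latestResult: BenchmarkEntry): void {
  // 1. Pull down the previous run
  run(`git fetch origin ${BRANCH}`);
  run(`git checkout ${BRANCH}`);

  // 2. Append the latest result
  const history: BenchmarkEntry[] = JSON.parse(readFileSync(FILE, "utf8"));
  history.push(latestResult);

  // 3. Truncate to the window of results you are interested in
  writeFileSync(FILE, JSON.stringify(history.slice(-WINDOW), null, 2));

  // 4. Commit & push the results back into the gh-pages branch
  run(`git add ${FILE}`);
  run(`git commit -m "chore: append benchmark results"`);
  run(`git push origin ${BRANCH}`);
}
```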

Looking at the stress script, I think the better route would be to define some boundaries/cases and then build out the suites to support them:

  1. Insert 100 bodies, Non Overlapping
  2. Insert 100 bodies, Overlapping
  3. Update 100 bodies, Non Overlapping
  4. Update 100 bodies, Overlapping
  5. Remove 100 bodies, Non Overlapping
  6. Remove 100 bodies, Overlapping

Repeat the above for each shape type, then mixed
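
As an illustration, a minimal sketch of one such suite using tinybench (suggested later in this thread) and the library's System/Circle API; the body count, spacing and tinybench options are illustrative, and the exact calls follow the README but may differ between versions:

```ts
// Sketch: a tinybench suite for the "100 bodies, insert / update / remove"
// cases listed above.
import { Bench } from "tinybench";
import { System, Circle } from "detect-collisions";

const COUNT = 100;

function createBodies(overlapping: boolean): { system: System; bodies: Circle[] } {
  const system = new System();
  // radius 10: 50px spacing keeps circles apart, 1px spacing makes them overlap
  const spacing = overlapping ? 1 : 50;
  const bodies: Circle[] = [];
  for (let i = 0; i < COUNT; i++) {
    const circle = new Circle({ x: i * spacing, y: 0 }, 10);
    system.insert(circle);
    bodies.push(circle);
  }
  return { system, bodies };
}

const bench = new Bench({ time: 100 });

for (const overlapping of [false, true]) {
  const label = overlapping ? "overlapping" : "non-overlapping";

  bench.add(`insert ${COUNT} circles, ${label}`, () => {
    createBodies(overlapping);
  });

  const updateCase = createBodies(overlapping);
  bench.add(`update ${COUNT} circles, ${label}`, () => {
    updateCase.bodies.forEach((body, i) => body.setPosition(i, i));
  });

  bench.add(`remove ${COUNT} circles, ${label}`, () => {
    // creation cost is included here; per-iteration setup hooks could isolate it
    const fresh = createBodies(overlapping);
    fresh.bodies.forEach((body) => fresh.system.remove(body));
  });
}

await bench.run();
console.table(bench.table());
```

Repeating the loop over the other body types (Box, Polygon, Ellipse, Line, Point) and a mixed set would cover the rest of the matrix above.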

I think I have some time tomorrow and may be able to get you a PR to help set the direction for this, assuming my comments above align with the vision for the library. I have synced up my fork and will try to get you a PR later in the afternoon, time permitting.

Prozi commented 1 year ago

> What are the expectations for the ray casts?

TBH the goal was that they should just work, on all types of bodies, which can be seen in the tank demo: https://prozi.github.io/detect-collisions/demo/

> What is the upper bound for the system and how many entities it should be supporting collisions with?

Based on the benchmark, I would say roughly 1500 constantly moving bodies while keeping 60 FPS updates.

> Are you open to leveraging something like tinybench to help offload the processing/statistical side of this?

yes

> I think I have some time tomorrow and may be able to get you a PR to help set the direction for this, assuming my comments above align with the vision for the library. I have synced up my fork and will try to get you a PR later in the afternoon, time permitting.

I would love a merge request with such changes.
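
For reference, a minimal sketch of the per-frame work implied by "roughly 1500 constantly moving bodies at 60 FPS"; the movement pattern and body setup are illustrative, and the System/Circle calls are assumed from the README:

```ts
// Sketch: one simulated frame with ~1500 moving bodies compared against
// the 60 FPS frame budget.
import { System, Circle } from "detect-collisions";

const BODY_COUNT = 1500;
const FRAME_BUDGET_MS = 1000 / 60; // ~16.7 ms per frame at 60 FPS

const system = new System();
// track positions ourselves so the sketch does not rely on body getters
const positions = Array.from({ length: BODY_COUNT }, () => ({
  x: Math.random() * 1000,
  y: Math.random() * 1000,
}));
const bodies = positions.map((p) => {
  const circle = new Circle({ x: p.x, y: p.y }, 5);
  system.insert(circle);
  return circle;
});

// one simulated frame: move every body, then check all collisions
function frame(): number {
  const start = performance.now();
  bodies.forEach((body, i) => {
    positions[i].x += Math.random() - 0.5;
    positions[i].y += Math.random() - 0.5;
    body.setPosition(positions[i].x, positions[i].y);
  });
  system.checkAll(() => {
    // collision response would go here
  });
  return performance.now() - start;
}

const elapsed = frame();
console.log(`one frame: ${elapsed.toFixed(2)} ms (budget: ${FRAME_BUDGET_MS.toFixed(2)} ms)`);
```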

BobGneu commented 1 year ago

Alrighty!

I am going to take a stab at it in the morrow. Started nosing around already and I think my schedule this week is open enough to get this moving.

BobGneu commented 1 year ago

Now that we have the baseline in here, is there a segment you would like to focus our efforts on?

Prozi commented 1 year ago

> Now that we have the baseline in here, is there a segment you would like to focus our efforts on?

hello

I think the speed of testing for

those are off the top of my head and could use a proper benchmark

Prozi commented 1 year ago

also thinking of:

  1. moving from CircleCI to GitHub Actions workflows altogether
  2. merging the stress test benchmark (npm run benchmark) into the workflow (a separate benchmark workflow?)
  3. running tests in a separate job as a GitHub workflow

what are your opinions?

Prozi commented 1 year ago

maybe divide the benchmarks into long-running and fast-running, and put them on different workflows/pipelines

and add the fast-running ones to a pre-commit hook

??

The insertion and collision benchmarks are the fast ones, I guess, and the stress benchmark (which can be improved) is long-running because it has 10 x 1000 ms scenarios (we can make them shorter, but even then it will be 10 x N ms, and FPS measurements over windows shorter than 1000 ms are not precise).
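
A minimal sketch of why sub-second windows are imprecise: FPS is whole frames counted in an N ms window scaled to 1000 ms, so each counted frame moves the reported value by 1000/N, and that quantization step grows as the window shrinks (the window sizes below are illustrative):

```ts
// Sketch: FPS resolution as a function of the measurement window.
// Each additional frame counted in an N ms window changes the
// extrapolated FPS by 1000 / N.
function fpsResolution(windowMs: number): number {
  return 1000 / windowMs;
}

for (const windowMs of [1000, 500, 250, 100]) {
  console.log(`window ${windowMs} ms -> FPS resolution ±${fpsResolution(windowMs).toFixed(1)}`);
}
// window 1000 ms -> FPS resolution ±1.0
// window 500 ms -> FPS resolution ±2.0
// window 250 ms -> FPS resolution ±4.0
// window 100 ms -> FPS resolution ±10.0
```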

bfelbo commented 2 months ago

This library looks awesome. Appreciate the streamlined focus on doing one thing and doing it well.

Have you done any benchmark comparisons between this library and more general physics libraries like Rapier, Jolt, matter.js, Planck.js?

Prozi commented 1 month ago

> This library looks awesome. Appreciate the streamlined focus on doing one thing and doing it well.
>
> Have you done any benchmark comparisons between this library and more general physics libraries like Rapier, Jolt, matter.js, Planck.js?

no, but basically, if they don't use WebAssembly, I doubt that something that implements THE SAME THING + physics would be any faster than my library

Prozi commented 1 month ago

also I have more features than some because of:

@bfelbo thanks