benchmark-action / github-action-benchmark

GitHub Action for continuous benchmarking to keep performance
https://benchmark-action.github.io/github-action-benchmark/dev/bench/

Crossing boundary between units in a single graph #15

Open jasonwilliams opened 4 years ago

jasonwilliams commented 4 years ago

Hey @rhysd, great work on this. Is anything supposed to happen if some data crosses the boundary between, say, nanoseconds and microseconds?

In my benchmark here that has happened in the bottom two: https://jasonwilliams.github.io/boa/dev/bench/

It looks like it's become a lot slower, but it actually sped up from 1 us to 900 ns. Unfortunately, once the chart is created with a unit, I'm guessing that unit is fixed.

I think I answered my own question: https://github.com/rhysd/github-action-benchmark/blob/master/src/default_index_html.ts#L198

jasonwilliams commented 4 years ago

I have an idea, which could be to use math.js to convert whatever future values back to the right unit: math.eval('901.86 ns to us') => 0.90186 us.

This could be broken down into 2 tasks (sketched below):

  1. get the most common unit for that series
  2. use math.js to convert the current value + unit to that unit
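
A minimal sketch of those two steps in TypeScript, assuming mathjs; the `BenchResult` shape and function names here are hypothetical, not the action's actual schema:

```ts
import { unit } from 'mathjs';

// Hypothetical shape of one point in a benchmark series.
interface BenchResult {
  value: number;
  unit: string; // e.g. 'ns', 'us', 'ms'
}

// Task 1: find the unit that occurs most often in the series.
function mostCommonUnit(series: BenchResult[]): string {
  const counts = new Map<string, number>();
  for (const r of series) {
    counts.set(r.unit, (counts.get(r.unit) ?? 0) + 1);
  }
  let best = series[0].unit;
  let bestCount = 0;
  for (const [u, c] of counts) {
    if (c > bestCount) {
      best = u;
      bestCount = c;
    }
  }
  return best;
}

// Task 2: convert every value to the most common unit with math.js.
function normalizeSeries(series: BenchResult[]): BenchResult[] {
  if (series.length === 0) return series;
  const target = mostCommonUnit(series);
  return series.map(r => ({
    value: unit(r.value, r.unit).toNumber(target),
    unit: target,
  }));
}
```

For example, normalizeSeries([{ value: 1, unit: 'us' }, { value: 901.86, unit: 'ns' }]) would chart both points in us (1 and 0.90186) instead of mixing scales.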
jasonwilliams commented 4 years ago

This is a bit messy right now, but I've got it working here: https://github.com/jasonwilliams/github-action-benchmark/blob/local-add-criterion-support/src/default_index_html.ts

Example of it working here (on the bottom graph, highlight the third node; it should be in ns): https://jasonwilliams.github.io/boa/dev/bench/

rhysd commented 4 years ago

Thank you for the work. Let me take a look after Criterion.rs support has landed.

wwerkk commented 6 months ago

Any news on this?

Benchmark tools such as Catch2 can output the value and its range in different units; at the moment only the unit that comes with the value is stored and displayed, which results in storage and display errors.

See Catch2 log:

[screenshot: Catch2 log output]

Corresponding entry in data.js file:

[screenshot: data.js entry]

This then displays as a mean of 9.24651 ms with a deviation of ~556 ms, because the range's own unit is dropped and the value's unit is applied to both numbers.
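
One way to fix this, sketched here with hypothetical field names (the real extractor's schema may differ), is to convert the range into the value's unit before storing, so a single unit field is correct for both numbers:

```ts
import { unit } from 'mathjs';

// Hypothetical raw result where value and range each carry their own
// unit, as in the Catch2 output above (numbers are illustrative).
interface RawResult {
  value: number;      // e.g. 9.24651
  valueUnit: string;  // e.g. 'ms'
  range: number;      // e.g. 556
  rangeUnit: string;  // e.g. 'us'
}

// Convert the range into the value's unit so the single stored `unit`
// describes both numbers, avoiding a 1000x display error.
function harmonize(r: RawResult): { value: number; range: number; unit: string } {
  return {
    value: r.value,
    range: unit(r.range, r.rangeUnit).toNumber(r.valueUnit),
    unit: r.valueUnit,
  };
}

// harmonize({ value: 9.24651, valueUnit: 'ms', range: 556, rangeUnit: 'us' })
// => { value: 9.24651, range: 0.556, unit: 'ms' }
```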

wwerkk commented 6 months ago

Normalising the units when filling the datasets, e.g. converting everything to seconds, also does the trick, unless you care that much about being as precise as possible.
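
A minimal sketch of that normalisation, again assuming mathjs: convert every point once, when the dataset is filled, so the chart never has to switch units.

```ts
import { unit } from 'mathjs';

// Convert any benchmark reading to seconds before it enters the dataset.
const toSeconds = (value: number, u: string): number =>
  unit(value, u).toNumber('s');

toSeconds(901.86, 'ns');  // => 9.0186e-7
toSeconds(9.24651, 'ms'); // => 0.00924651
```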

> I have an idea, which could be to use math.js to convert whatever future values back to the right unit: math.eval('901.86 ns to us') => 0.90186 us.
>
> This could be broken down into 2 tasks:
>
> 1. get the most common unit for that series
> 2. use math.js to convert current value + unit to that unit