krausest / js-framework-benchmark

A comparison of the performance of a few popular javascript frameworks
https://krausest.github.io/js-framework-benchmark/
Apache License 2.0

How to track when a benchmark improves? #728

Closed andykais closed 3 years ago

andykais commented 4 years ago

Is there a way to get this information currently?

krausest commented 4 years ago

Sorry, this is currently impossible: we're measuring a combination of framework version, browser version, and OS version (including CPU vulnerability mitigations), and there's no way to isolate the culprit.

And it seems like only the frameworks are getting faster :cry: https://github.com/krausest/js-framework-benchmark/issues/683#issuecomment-616080615 (ok, GNOME 3.36 was an improvement, but it can't compensate for the effect Chrome has).

Please don't take the ranking too literally. Many frameworks are close enough that the order can change between runs. There's a comparison function (in the display mode) that helps make statistically safer statements.

andykais commented 4 years ago

Fair enough. I do think there is still useful information to glean from these benchmarks. I'm specifically interested in the average performance of a library over time, rather than its performance on a specific OS or browser, so in that respect the data is still one-dimensional. What I imagine is something like this:

[view a specific library across versions]

**Viewing Svelte**

| Version | Create Rows performance |
| ------- | ----------------------- |
| 3.18    | 158.4 ± 1.1 (1.33)      |
| 3.11    | 178.4 ± 1.1 (1.0)       |
| ...     | ...                     |

and so on.
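A minimal sketch of how that per-version view could be computed, assuming we already have mean "create rows" durations keyed by version (the `versionTable` helper and the numbers are illustrative, not part of the benchmark's tooling):

```javascript
// Sketch: render a per-library view of "create rows" durations across
// versions. Each row shows the mean duration and a slowdown factor
// relative to the fastest version. All names and numbers are made up.
function versionTable(meanDurationsByVersion) {
  const best = Math.min(...Object.values(meanDurationsByVersion));
  return Object.entries(meanDurationsByVersion).map(
    ([version, ms]) =>
      `${version}  ${ms.toFixed(1)} ms (${(ms / best).toFixed(2)})`
  );
}

// Example with invented numbers:
console.log(versionTable({ "3.18": 158.4, "3.11": 178.4 }).join("\n"));
```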

That, in addition to a changelog that tracks ranking movements:

**Benchmark April 2020**

Version changes:

```diff
- solid 0.17.0
+ solid 0.17.2
- preact 10.0.0
+ preact 10.0.1
```

Ranking movements:
andykais commented 4 years ago

Heck, if the benchmark data were available in the releases section of this repo each time the benchmark was run, that would be fantastic in its own right. I could take a stab at building a UI for comparing performance diffs over time.

krausest commented 4 years ago

The results are stored in a ts file (pretty much from the beginning, I think): https://github.com/krausest/js-framework-benchmark/commits/master/webdriver-ts-results/src/results.ts, and since 2019 also in a json file: https://github.com/krausest/js-framework-benchmark/commits/master?before=06d5f5944a7454a7469ee6b35176c07218d43484+35&path%5B%5D=webdriver-ts&path%5B%5D=results.json. Maybe one could relate the results to vanillajs to reduce the influence of the non-framework factors. Looking forward to hearing whether that helps!
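A rough sketch of that vanillajs normalization; the `{ framework, benchmark, values }` entry shape used here is an assumption for illustration, not the documented format of results.json:

```javascript
// Sketch: express every framework's benchmark result as a factor relative
// to vanillajs, to dampen non-framework factors (browser, OS, hardware)
// that shift between runs.
// ASSUMPTION: entries look like { framework, benchmark, values: number[] }.

const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;

function normalizeAgainstVanilla(results) {
  // The mean vanillajs duration per benchmark is the baseline.
  const baseline = {};
  for (const r of results) {
    if (r.framework.startsWith("vanillajs")) {
      baseline[r.benchmark] = mean(r.values);
    }
  }
  // Factor 1.0 means "as fast as vanillajs"; null if no baseline exists.
  return results.map((r) => ({
    framework: r.framework,
    benchmark: r.benchmark,
    factor: baseline[r.benchmark]
      ? mean(r.values) / baseline[r.benchmark]
      : null,
  }));
}
```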

andykais commented 4 years ago

Thanks. Is there a way to know when a group of benchmarks was run? I suppose I could group everything based on the package version number.
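Grouping by package version could be sketched like this, assuming framework ids in the results follow a `name-vX.Y.Z-keyed` convention (both helpers are hypothetical):

```javascript
// Sketch: parse the version out of a framework id and bucket results by
// name + version. The "name-vX.Y.Z-keyed" id convention is an assumption
// about the dataset, not a documented format.
function splitId(id) {
  const m = id.match(/^(.*)-v([\d.]+)-(keyed|non-keyed)$/);
  return m ? { name: m[1], version: m[2], keyed: m[3] } : { name: id };
}

function groupByNameVersion(results) {
  const groups = {};
  for (const r of results) {
    const { name, version } = splitId(r.framework);
    const key = `${name}@${version || "?"}`;
    (groups[key] = groups[key] || []).push(r);
  }
  return groups;
}
```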

krausest commented 4 years ago

BTW I'd highly recommend against taking the ranking too literally. Variance is high enough and many frameworks are close enough such that the ranking isn't stable and frameworks might change positions just by running the benchmark again. Please use the "compare results against" display mode to compare implementations which uses a statistical significance test.

krausest commented 3 years ago

I'm closing older issues that won't get fixed.