fuchsto opened 8 years ago
During the Christmas days I did some research and came to the conclusion that we definitely should not implement this ourselves. There are (at least) two pretty advanced benchmarking libraries for C++:
Both offer a highly configurable but easy-to-use interface similar to GoogleTest. While GoogleBenchmark might be easier to use, Celero is better suited to our quite complex input parameters. Furthermore, it lends itself well to automated performance regression testing. A comparison of these two libraries (along with others) can be found here.
I know, switching to one of the libraries will require quite some refactoring, but IMO it's worth the time. Maybe this could also be implemented as part of a bachelor thesis?
@dash-project/developers Please tell me your opinion about my proposal.
@fmoessbauer Apart from that: you, my dear, are officially forbidden to direct your attention to mundane tasks like this. You have more interesting stuff to invest your efforts into.
... the same applies to @devreal for the same reason. Seriously: if you do, I won't merge your pull requests or roll them back if someone else does. Leave CI alone. It's the evil witch. It drains your energy.
For automated performance reports in CI, benchmark applications should provide an option to print performance measurements in a unified format.
For this, utility types/concepts `dash::bench::BenchmarkResult` and `dash::bench::BenchmarkResultPrinter` should be introduced to decouple benchmark parameters and performance metrics from output. Illustrating example:
... with `123.4`, `226.6`, and `559.0` as measures in a "unified" performance metric (higher is better) specific to the benchmark. The overall goal is to emit warnings if these values decrease in a nightly build.