Closed bp74 closed 11 years ago
Hi Bernhard,
Thanks for the report. I agree; I think the output should be changed to be more explicit. Do you have any specific recommendations for what you'd like to see in the output?
Thanks, John
I'm not sure; here are just some random ideas...
It would be interesting to see and to get at the measured values after the run:
You could add setters for:
Then maybe it's not necessary to override warmup and exercise?
Follow up ideas:
The "report" method could take a "BenchmarkConfig" object and return a "BenchmarkReport" object. The config could hold the warmup and exercise configuration and this config could be used in multiple benchmark.report calls. This could be useful if you run many benchmarks but you setup the config just once.
Hi Bernhard,
I've updated the benchmark_harness library to print total time, number of runs, and average run time.
I don't have the time right now to implement your (excellent!) suggestions.
John
Hi Bernhard,
The latest change has been reverted because we need to keep the benchmark harness compatible with internal benchmarks. I've expanded the documentation explaining how to interpret and compare the results.
John
Hello
I've used the benchmark_harness for the first time and I got confused. The example shows that I should override the "run" method with my benchmark code. At the end a run time is reported, but it is not the average run time of "run"; it is 10x that. The reason is that "exercise" calls "run" 10 times, so what is actually reported is the run time of "exercise".
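The confusion can be reproduced with a small Python analogue of the harness. The class and method names mirror the Dart API ("run", "warmup", "exercise"), but the implementation below is an assumption for illustration, not the library's actual code:

```python
import time


class BenchmarkBase:
    def run(self):
        # Users override this with the code to measure.
        pass

    def warmup(self):
        # Default warmup: a single call to run().
        self.run()

    def exercise(self):
        # Default exercise: run() is called 10 times.
        for _ in range(10):
            self.run()

    def measure(self):
        # What gets timed is exercise(), i.e. 10 calls to run().
        self.warmup()
        start = time.perf_counter()
        self.exercise()
        return time.perf_counter() - start


class MyBenchmark(BenchmarkBase):
    def run(self):
        time.sleep(0.001)  # pretend workload: about 1 ms per run


elapsed = MyBenchmark().measure()
# elapsed covers 10 calls to run(), so it is roughly 10x the
# time of a single run, not the per-run average.
```

So a user who expects "time of one run()" sees a number about an order of magnitude larger, unless they know to divide by the number of iterations in "exercise".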
Maybe you could make this clearer in the example, or change run/warmup/exercise entirely to make the behavior more intuitive.
Thank you, Bernhard