niklas-heer / speed-comparison

A repo which compares the speed of different programming languages.
https://niklas-heer.github.io/speed-comparison
MIT License

Provide "clean" and "real world" results #59

Open niklas-heer opened 1 year ago

niklas-heer commented 1 year ago

As suggested in #51 by @HenrikBengtsson, more "clean" data for the pi calculation could be gathered by measuring the performance of each language with and without the pi calculation and then subtracting one from the other.

[...] I think it would be better if you could find a way to not include the startup times and the parsing of 'rounds.txt' in the results. A poor man's solution would be to benchmark each language with and without the part that calculates pi and then subtract to get the timings of interest.
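For illustration, a minimal Python sketch of what those two variants could look like. This is not the repo's actual source; the `--baseline` flag and the file layout are assumptions made purely to show the idea:

```python
# leibniz.py -- illustrative sketch only, not the repo's actual implementation.
import sys

def leibniz_pi(rounds: int) -> float:
    """Leibniz series: pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...)."""
    pi, sign = 1.0, 1.0
    for i in range(2, rounds + 2):
        sign = -sign
        pi += sign / (2.0 * i - 1.0)
    return 4.0 * pi

if __name__ == "__main__":
    # Startup and IO happen in both variants; they are exactly what the
    # "clean" number should exclude.
    with open("rounds.txt") as f:
        rounds = int(f.read().strip())

    # Variant A ("real world"): compute pi as usual.
    # Variant B (baseline): skip the computation, here via a hypothetical flag.
    if "--baseline" not in sys.argv:
        print(f"{leibniz_pi(rounds):.10f}")
```

Timing both invocations externally and subtracting the baseline run from the full run would approximate the pure pi-calculation time.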

I think it would be best to keep both sets of data: "real world" data with startup and IO, and "clean" data for just calculating pi. I would keep both in the CSV, but I'm not sure which one to favour for the image creation. Probably the "clean" data 🤔

In terms of implementation, I can see two approaches:

Obviously, both would require adjustments to scbench and the analysis step.

francescoalemanno commented 1 year ago

I believe setting rounds to 0 should be about equivalent to benchmarking without the pi calculation.
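To make that concrete, here is a rough sketch of a wrapper that times a full run against a rounds-equal-0 run and subtracts the two. The command and the file handling are placeholders for illustration, not how scbench actually works:

```python
# measure_clean.py -- hypothetical helper, not part of scbench.
import subprocess
import time

def time_command(cmd: list[str]) -> float:
    """Wall-clock time of one full process run, including startup and IO."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True, capture_output=True)
    return time.perf_counter() - start

def write_rounds(n: int) -> None:
    with open("rounds.txt", "w") as f:
        f.write(str(n))

if __name__ == "__main__":
    cmd = ["python3", "leibniz.py"]  # placeholder command for one language

    write_rounds(1_000_000)
    real_world = time_command(cmd)   # startup + IO + pi calculation

    write_rounds(0)
    baseline = time_command(cmd)     # startup + IO only (rounds = 0)

    clean = real_world - baseline    # approximate pure pi-calculation time
    print(f"real world: {real_world:.4f}s  baseline: {baseline:.4f}s  clean: {clean:.4f}s")
```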

niklas-heer commented 1 year ago

@francescoalemanno that is a brilliant idea. It would at least make things way easier to implement.

Glavo commented 1 year ago

@niklas-heer

I reimplemented the benchmark for C++, Java, Golang, Python and JavaScript: https://github.com/Glavo/leibniz-benchmark

I ran twenty rounds of benchmarking and took the average time of the last ten rounds. Here are the results I got:
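For illustration, the warm-up-then-average scheme described above could be sketched like this in Python. The round counts and the workload are assumptions; the linked repo has its own per-language harnesses:

```python
# warmup_bench.py -- illustrative only, not the harness from the linked repo.
import time

def leibniz_pi(rounds: int) -> float:
    pi, sign = 1.0, 1.0
    for i in range(2, rounds + 2):
        sign = -sign
        pi += sign / (2.0 * i - 1.0)
    return 4.0 * pi

def bench(rounds: int, total_runs: int = 20, measured_runs: int = 10) -> float:
    """Run the workload total_runs times and average only the last
    measured_runs, treating the earlier runs as warm-up (JIT, caches)."""
    timings = []
    for _ in range(total_runs):
        start = time.perf_counter()
        leibniz_pi(rounds)
        timings.append(time.perf_counter() - start)
    return sum(timings[-measured_runs:]) / measured_runs

if __name__ == "__main__":
    print(f"average of last 10 of 20 runs: {bench(1_000_000):.6f} s")
```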

Glavo commented 1 year ago

I think the "clean" result is the one that better reflects the real-world situation.

The main factor affecting the results now is the startup and loading time, not the real performance of the language.