This repository hosts the main benchmarking tools & data for Arturo itself.
The main scripts are supposed to run automatically at 21:00 UTC, on a daily basis - and only if there are new commits to the main repo since the latest benchmark run - after re-building Arturo's master branch from scratch, in release mode, on a fresh-spawn/vanilla DigitalOcean droplet (c-4) with the following specifications:
The main benchmarking tool orchestrating the whole process is Hyperfine - which is admittedly a... hyper-fine fit for this type of job.
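Under the hood, each benchmark essentially boils down to a Hyperfine invocation along these lines - a rough sketch only; the actual script names, run counts and export paths used here are assumptions, not the real configuration:

```
# Rough sketch of a single benchmark measurement via hyperfine;
# the script name, warmup/run counts and the export path are illustrative only
hyperfine \
    --warmup 3 \
    --runs 10 \
    --export-json results/fibonacci.json \
    "arturo scripts/fibonacci.art"
```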
All the results will be stored here (in the /results folder):
The collected data will - soon - be available from within Arturo's main website (pretty much in the fashion of V lang - only looking a bit better, I hope... :))
Although the main idea is to run the relevant scripts automatically, via a Cron job on our main server, the benchmarks can also be triggered manually.
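(For reference, the automated trigger could be as simple as a crontab entry like the one below - purely illustrative; the actual paths, log location and where the "new commits" check lives are assumptions, not a description of the real server setup.)

```
# Hypothetical crontab entry: kick off the benchmarks daily at 21:00 UTC.
# The scripts themselves are assumed to bail out early when there are no
# new commits; paths and log file are placeholders for illustration only.
0 21 * * * cd /home/benchmarks/benchmarks && ./run.sh >> /var/log/arturo-benchmarks.log 2>&1
```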
With hyperfine and Arturo installed (and globally available in the $PATH), and the two repos (this one and the main Arturo repo) side-by-side (that is: under the exact same parent folder), all we have to do is enter this folder (/benchmarks) and run:
./run.sh <NUMBER_OF_RUNS_PER_BENCHMARK>   (the <NUMBER_OF_RUNS_PER_BENCHMARK> argument is optional)
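So, for example, asking for 10 runs per benchmark would look like this (the count is just an arbitrary pick for the sake of the example):

```
# 10 runs per benchmark (arbitrary example value)
./run.sh 10

# or let the script fall back to its default number of runs
./run.sh
```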