Open chinmaygarde opened 1 year ago
The way SkiaPerf uploads from Engine CI currently work is the following:

In https://github.com/flutter/engine/blob/main/ci/builders/standalone/linux_benchmarks.json, we do a Linux `host_release` build, then run the scripts `flutter/testing/benchmark/generate_metrics.sh` followed by `flutter/testing/benchmark/upload_metrics.sh`.

`flutter/testing/benchmark/upload_metrics.sh` works by invoking `testing/benchmark/bin/parse_and_send.dart`, which accepts a `.json` file populated with the benchmark info/results. CI populates environment variables so that `parse_and_send.dart` knows how to upload data to SkiaPerf, thanks to the `metric_center_token` entry in the `context` of `linux_benchmarks.json`.
To upload malioc results, we probably need a new script that accepts `malioc.json` and filters/translates it into something that can then be passed to `benchmark/bin/parse_and_send.dart`. That script would then be called from the `"tests"` entry of one of the build configs under `ci/builders`, probably just `linux_benchmarks.json`, so that all or most of the SkiaPerf uploading happens in the same place.
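A minimal sketch of what such a filter/translate step might look like. Note the input and output shapes here are assumptions for illustration, not the real `malioc.json` schema or the exact format `parse_and_send.dart` expects:

```python
# Hypothetical sketch: flatten per-shader malioc stats into simple
# {"name": ..., "value": ...} metric records that an uploader script
# could consume. The "shaders"/"stats" keys are assumed, not the real
# malioc output schema.
import json
import sys


def translate_malioc(malioc_doc):
    """Flatten per-shader numeric stats into metric records."""
    metrics = []
    for shader in malioc_doc.get("shaders", []):  # assumed key
        name = shader.get("name", "unknown")
        for stat, value in shader.get("stats", {}).items():  # assumed key
            # Keep only numeric stats; skip strings/notes.
            if isinstance(value, (int, float)) and not isinstance(value, bool):
                metrics.append({"name": f"{name}.{stat}", "value": value})
    return metrics


if __name__ == "__main__":
    # Usage: python translate_malioc.py malioc.json > metrics.json
    with open(sys.argv[1]) as f:
        doc = json.load(f)
    json.dump(translate_malioc(doc), sys.stdout, indent=2)
```

The real script would also need to attach whatever tags/metadata SkiaPerf expects (commit hash, builder name, etc.), which CI already exposes via environment variables.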
Today, we have to read the JSON file directly to determine the performance impact (as statically determined) of changes to Impeller shaders. We should track this on a dashboard of some sort to spot regressions in a more user-friendly manner. I am not sure whether SkiaPerf is set up for this, but it would be the logical spot to visualize such stats.
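Until a dashboard exists, the manual "read the JSON file directly" step could be scripted. A sketch of comparing two flattened snapshots and flagging regressions; the flat `{metric_name: value}` shape and the threshold are assumptions:

```python
# Hypothetical sketch: compare a baseline snapshot of malioc-derived
# metrics against the current one and flag metrics that grew by more
# than a fractional threshold (higher = worse, e.g. cycle counts).
def find_regressions(baseline, current, threshold=0.05):
    """Return {name: (old, new, fractional_change)} for metrics that regressed."""
    regressions = {}
    for name, base in baseline.items():
        new = current.get(name)
        if new is None or base == 0:
            continue  # metric removed, or no meaningful baseline
        change = (new - base) / base
        if change > threshold:
            regressions[name] = (base, new, change)
    return regressions
```

This is essentially what a SkiaPerf alert would do automatically once the data is uploaded.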