I have a manually-maintained file of time annotations in yjit-metrics-pages/_include/events.json, recording various things known to affect the benchmark timings shown in the graphs: switching to a different AWS instance type, changing warmup params, known slowdown bugs in CRuby, and so on.
That's fine. It's not a bad thing to keep track of somewhere. But these events would be much more useful if they were added as annotations on the timeline graphs, so we could easily match them up visually with changes in benchmark timing.
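
A minimal sketch of what that could look like, assuming the timeline graphs are drawn with D3 and that each events.json entry carries a date plus a short description (the `date` and `description` field names below are illustrative, not the file's actual schema):

```js
// Illustrative sketch only. Assumes:
//   - the timeline graphs are rendered with D3,
//   - `svg`, `x` (the time scale) and `height` already exist in the chart code,
//   - each entry has hypothetical `date` and `description` fields, and the
//     events.json array has been made available to the page as `events`.
svg.append("g")
    .attr("class", "event-annotations")
  .selectAll("line")
  .data(events)
  .join("line")
    .attr("x1", d => x(new Date(d.date)))   // vertical marker at the event's date
    .attr("x2", d => x(new Date(d.date)))
    .attr("y1", 0)
    .attr("y2", height)
    .attr("stroke", "#999")
    .attr("stroke-dasharray", "4 2")
  .append("title")                           // plain SVG tooltip showing the event text
    .text(d => d.description);
```

Hovering a marker would show the event text, which should make it straightforward to line a timing shift up with its known cause.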