Introduce the ability to have presubmit perf tests that block check-ins based on perf regressions.
This implies that we need to be able to reliably measure performance with reasonable accuracy. It also implies we need a mechanism to easily track baseline performance numbers for a suite of tests.
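For the measurement side, Go's standard benchmark support is one plausible foundation. The sketch below is illustrative only; the benchmark name and synthetic workload are placeholders, not part of this proposal.

```go
package perf_test

import "testing"

// sink keeps the compiler from optimizing the workload away.
var sink int

// BenchmarkCheck is a placeholder; a real suite would exercise an
// actual Istio code path instead of this synthetic loop.
func BenchmarkCheck(b *testing.B) {
	for i := 0; i < b.N; i++ {
		s := 0
		for j := 0; j < 1024; j++ {
			s += j
		}
		sink = s
	}
}
```

Running such benchmarks repeatedly (e.g. `go test -bench=. -count=5`) and comparing runs with a tool like benchstat is one way to gauge how much run-to-run noise a presubmit gate would have to tolerate.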
A low-tech way to track performance baselines would be a text file in the repo that lists the individual perf tests along with their minimum baseline expected results. This is what we've done for code coverage and would seem to be workable here too.
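As a rough sketch of how that comparison could work, assuming a checked-in, whitespace-separated baseline file that lists each benchmark with a maximum allowed ns/op (the file format, names, and tolerance below are assumptions for illustration, not a decided design):

```go
package perfbaseline

import (
	"bufio"
	"fmt"
	"strconv"
	"strings"
)

// Baseline maps a benchmark name to its maximum allowed ns/op.
type Baseline map[string]float64

// ParseBaseline reads lines of the form "<benchmark> <max ns/op>",
// skipping blank lines and '#' comments.
func ParseBaseline(text string) (Baseline, error) {
	b := Baseline{}
	sc := bufio.NewScanner(strings.NewReader(text))
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue
		}
		fields := strings.Fields(line)
		if len(fields) != 2 {
			return nil, fmt.Errorf("malformed baseline line: %q", line)
		}
		limit, err := strconv.ParseFloat(fields[1], 64)
		if err != nil {
			return nil, fmt.Errorf("bad limit in %q: %v", line, err)
		}
		b[fields[0]] = limit
	}
	return b, sc.Err()
}

// Check fails when a measured result exceeds its baseline by more than
// the given tolerance (e.g. 0.05 for 5%), which absorbs normal noise.
func (b Baseline) Check(name string, measuredNsPerOp, tolerance float64) error {
	limit, ok := b[name]
	if !ok {
		return fmt.Errorf("no baseline recorded for %s", name)
	}
	if measuredNsPerOp > limit*(1+tolerance) {
		return fmt.Errorf("%s regressed: %.0f ns/op > baseline %.0f ns/op",
			name, measuredNsPerOp, limit)
	}
	return nil
}
```

A presubmit hook would then parse the benchmark output, run Check for each entry, and fail the build on any error; raising a baseline would be an explicit, reviewed change to the file, just as with the coverage thresholds.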
We want something general purpose that's not tied to Mixer so it can be leveraged for other Istio components.