I initially added them to the unit tests, but due to the matrix testing and the vast number of unit tests (and subtests), it yields about 2k tests per run, which would eat up the quota very quickly (we've got 6M in total per year).
With that in mind, I think it's better to start with just the integration and e2e tests, which are usually the most important to monitor. Not that the unit ones aren't important, but they're naturally more stable.
I didn't plug in the integration ones: they fail because the required mocha-multi-reporters module can't find the Buildkite Test Analytics reporter (see https://github.com/sourcegraph/cody/actions/runs/10215434987/job/28264855049#step:8:54).
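For context, the intended wiring looks roughly like this: mocha-multi-reporters reads a JSON config listing the enabled reporters, and mocha is invoked with `--reporter mocha-multi-reporters --reporter-options configFile=<path>`. A minimal sketch of that config, assuming the `buildkite-test-collector` npm package exposes its mocha reporter at `buildkite-test-collector/mocha/reporter` (that reporter path is an assumption, not verified here):

```json
{
  "reporterEnabled": "spec, buildkite-test-collector/mocha/reporter"
}
```

The failure linked above suggests the reporter module isn't resolvable from where mocha-multi-reporters `require`s it, which is the part that still needs investigation.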
I'll reach out on Slack about this to get some early feedback, and possibly to find some help with the integration ones if they're deemed worthy of being tracked as well.
Test plan
See https://buildkite.com/organizations/sourcegraph/analytics/suites/cody/runs/6af3d32c-7f7c-8b76-83ee-4a325422d5a7#slowest
Changelog