bahmutov / cypress-split

Split Cypress specs across parallel CI machines for speed
MIT License

👌 Take into consideration scenario when spec has only skipped tests #216

Closed · tommy-anderson closed this issue 7 months ago

tommy-anderson commented 8 months ago

I've been using cypress-split for some time, and I've noticed issues with how tests are divided in our main project, which runs approximately 800 tests across 17 machines in parallel. Specifically, some machines finish much sooner than others, and on those machines there is a significant difference between the test durations estimated by cypress-split and the actual run time.

Upon investigating these discrepancies between estimated and actual durations, I discovered that some of our specs had all of their tests skipped. Let's refer to these as skipped_specs.

Here are two important observations:

  1. Cypress detects these skipped_specs and attempts to run them.
  2. The results.status.passed for these specs is 0, meaning they're not counted as passed tests, and their timings are not saved to the JSON file.

As a result, when cypress-split encounters a skipped_spec, it treats it as a new spec and assigns it the average duration. This skews how specs are distributed across machines: some machines finish much earlier than others, because the average spec duration is far larger than the near-zero time a skipped_spec actually takes.
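To make the effect concrete, here is a small JavaScript sketch (not cypress-split's actual implementation; the spec names, durations, and the `{ durations: [{ spec, duration }] }` timings shape are illustrative assumptions) of how a spec missing from timings.json ends up estimated at the average duration:

```js
// Illustrative sketch of the load-balancing gap described above.
// Assumed timings shape: { durations: [{ spec, duration }] }, hypothetical specs.
const timings = {
  durations: [
    { spec: 'cypress/e2e/checkout.cy.js', duration: 120000 },
    { spec: 'cypress/e2e/login.cy.js', duration: 90000 },
  ],
}

const known = new Map(timings.durations.map((d) => [d.spec, d.duration]))
const average =
  timings.durations.reduce((sum, d) => sum + d.duration, 0) /
  timings.durations.length

// A skipped_spec never writes its timing, so it is estimated like a brand-new
// spec at the average (105000 ms here) instead of the near-zero time it takes.
const estimate = (spec) => known.get(spec) ?? average

console.log(estimate('cypress/e2e/login.cy.js'))       // 90000
console.log(estimate('cypress/e2e/all-skipped.cy.js')) // 105000
```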

My proposed solution is to include the results for each skipped_spec in timings.json by detecting the specs whose tests were all skipped (pending). With the correct timings recorded, the load balancer will distribute specs as expected.
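One way to express the idea, sketched here at the user level rather than as the change that was eventually merged: in an after:run handler, detect runs where every test is pending and record their (near-zero) durations in the timings file. The timings.json location and its `{ durations: [{ spec, duration }] }` shape are assumptions; the results object follows the Cypress Module API (`results.runs[].spec.relative` and `results.runs[].stats`), whose field names may vary slightly across Cypress versions.

```js
// cypress.config.js — a minimal sketch, assuming timings.json lives in the
// project root and uses a { durations: [{ spec, duration }] } shape.
const fs = require('fs')
const path = require('path')
const { defineConfig } = require('cypress')

const timingsPath = path.join(__dirname, 'timings.json')

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      on('after:run', (results) => {
        // results.runs is absent when the run errors out before any spec runs
        if (!results || !Array.isArray(results.runs)) return

        const timings = fs.existsSync(timingsPath)
          ? JSON.parse(fs.readFileSync(timingsPath, 'utf8'))
          : { durations: [] }

        for (const run of results.runs) {
          const { stats } = run
          const allPending = stats.tests > 0 && stats.pending === stats.tests
          if (!allPending) continue

          // Record the real (near-zero) duration so the load balancer does not
          // fall back to the average estimate for this spec next time.
          const duration = stats.duration || 0
          const entry = timings.durations.find((d) => d.spec === run.spec.relative)
          if (entry) entry.duration = duration
          else timings.durations.push({ spec: run.spec.relative, duration })
        }

        fs.writeFileSync(timingsPath, JSON.stringify(timings, null, 2))
      })
      return config
    },
  },
})
```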

tommy-anderson commented 8 months ago

@bahmutov tell me your thoughts on this when you have a chance

github-actions[bot] commented 7 months ago

🎉 This PR is included in version 1.19.1 🎉

The release is available on:

Your semantic-release bot 📦🚀