doorjamb closed this issue 7 months ago
yeah, looks like a bug, will fix
:tada: This issue has been resolved in version 1.15.7 :tada:
The release is available on:
Your semantic-release bot :package::rocket:
Thanks, much appreciated!!
More minor, but in the latest version with this fix, I can see an issue where the merge step now computes and saves all updated durations with 11 decimal places unnecessarily:
{
  "spec": "cypress/integration/B.ts",
  "duration": 100.98374627384
}
and it seems to slowly increment them across separate timing runs. I can't tell whether that could cause issues down the line
Yeah, let me round it
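For illustration, rounding the stored duration could look like the sketch below. The two-decimal precision is an assumption for the example, not necessarily what the actual fix uses:

```javascript
// Hypothetical sketch: strip floating-point noise from a computed
// duration before saving it back to timings.json.
const raw = 100.98374627384;
const rounded = Math.round(raw * 100) / 100; // keep two decimal places
// rounded === 100.98
```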
Hey @bahmutov, big fan of your work! 😀 I'm looking into leveraging cypress-split, and I believe I may have stumbled across a bit of a logical issue, unless I'm missing something about how parallel job timings are shared?
TL;DR (title): timings from parallel runs won't get updated whenever the spec ran more quickly than it did on a previous run.
Explanation
Because cypress-split saves updated timings to the same file it reads them from (timings.json), the timings.json produced by parallel runs will not contain updated spec durations whenever the most recent run of a spec was faster than the value already recorded for it in timings.json.
Example:
Starting state:
we have 2 specs in timings.json: A with time = 100s and B with time = 100s
Let's say we run our tests on two parallel runners: runner 1 runs spec A and runner 2 runs spec B, and each spec now finishes in 85s.
Now, because we had timings already in timings.json, the values in each runner's timings.json would be: runner 1 has A = 85s, B = 100s; runner 2 has A = 100s, B = 85s.
We would now expect (logically) that our timings file after merging results would contain specs A and B each with 85s timings. But the merge code in timings.js, which gets used when running cypress-split-merge, keeps the larger value for each spec. Which is fair, since it has no notion of which result is the most up-to-date one, unless it compares against the original timings.json committed in the repo.
After merging the two timings.json files above, the resulting merged timings.json would consist of: spec A = 100s, spec B = 100s,
but really, we would want the timings.json to now be: spec A = 85s, spec B = 85s.
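The merge behavior described above can be sketched as follows. This is a hypothetical illustration (the mergeTimings helper, the durations file shape, and the spec names are assumptions for the example), not the actual cypress-split source:

```javascript
// Sketch of a "larger value wins" merge: for each spec, keep the
// maximum duration seen across all runners' timings files, so a
// faster re-run can never lower a recorded timing.
function mergeTimings(a, b) {
  const merged = {};
  for (const { spec, duration } of [...a.durations, ...b.durations]) {
    merged[spec] = Math.max(merged[spec] ?? 0, duration);
  }
  return {
    durations: Object.entries(merged).map(([spec, duration]) => ({ spec, duration })),
  };
}

// Runner 1 ran spec A in 85s; runner 2 ran spec B in 85s.
// Each runner still carries the stale 100s value for the spec it did not run.
const runner1 = { durations: [{ spec: 'A', duration: 85 }, { spec: 'B', duration: 100 }] };
const runner2 = { durations: [{ spec: 'A', duration: 100 }, { spec: 'B', duration: 85 }] };
const merged = mergeTimings(runner1, runner2);
// Both specs come out at 100s, not the observed 85s.
```

With this kind of merge, the stale 100s copies always beat the fresh 85s measurements, which matches the observed output.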
Local Repro:
In /partials-test/1 have this json (spec A freshly measured at 85s):
{
  "durations": [
    { "spec": "A", "duration": 85 },
    { "spec": "B", "duration": 100 }
  ]
}
In /partials-test/2 have this json (spec B freshly measured at 85s):
{
  "durations": [
    { "spec": "A", "duration": 100 },
    { "spec": "B", "duration": 85 }
  ]
}
run:
Output: the merged timings contain A = 100s and B = 100s; the faster 85s measurements are lost.
This has the side effect that timings.json always holds the most pessimistic run times: whenever a spec starts to run faster (it was optimized, split into more than one file, etc.), its entry in timings.json will never be updated to the new (smaller) value.
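One conceivable direction for a fix, sketched here as a hypothetical mergeWithBaseline helper (not the library's actual code): compare each runner's value against the baseline timings.json committed in the repo, and treat any value that differs from the baseline as freshly measured, so it wins over stale copies.

```javascript
// Sketch, assuming the merge step is given the baseline timings.json
// from the repo in addition to each runner's output: a duration that
// differs from the baseline must have been measured in this run.
function mergeWithBaseline(baseline, runs) {
  const base = Object.fromEntries(baseline.durations.map((d) => [d.spec, d.duration]));
  const merged = { ...base };
  for (const run of runs) {
    for (const { spec, duration } of run.durations) {
      if (duration !== base[spec]) {
        merged[spec] = duration; // freshly measured value wins
      }
    }
  }
  return {
    durations: Object.entries(merged).map(([spec, duration]) => ({ spec, duration })),
  };
}

const baseline = { durations: [{ spec: 'A', duration: 100 }, { spec: 'B', duration: 100 }] };
const run1 = { durations: [{ spec: 'A', duration: 85 }, { spec: 'B', duration: 100 }] };
const run2 = { durations: [{ spec: 'A', duration: 100 }, { spec: 'B', duration: 85 }] };
const fixed = mergeWithBaseline(baseline, [run1, run2]);
// Now both A and B correctly come out at 85s.
```

This is only one option; it breaks down if two runners both re-measure the same spec, in which case some tie-breaking rule (e.g. the latest or the average of the fresh values) would still be needed.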