Closed spydon closed 4 months ago
Apparently also sometimes happens for windows: https://github.com/invertase/melos/actions/runs/8374207232/job/22928929951?pr=671
Here's another one: https://github.com/invertase/melos/actions/runs/8420212367/job/23054448154
@jessicatarra maybe you could have a look at this test if you have time: "multiple scripts verifies that a melos script can call another script containing steps, and ensures all commands in those steps are executed successfully"? :) It seems to be the flakiest one.
Sure, I'll take a look
https://github.com/invertase/melos/actions/runs/8434508600/job/23098706712?pr=679 this seems to be the same problem as the one you solved, right @jessicatarra, but on another test?
I'm not sure this is a flaky test; it looks like a real issue because it's failing across all environments, possibly a missing dependency. I'll take a look at this too.
@spydon it's a flaky one after all, so disregard my previous comment. It's a very similar issue to the one you mentioned, see the most recent log:
test/workspace_test.dart: Workspace can be accessed from anywhere within a workspace (failed)
Expected: 'a\n'
''
Actual: 'Resolving dependencies...\n'
'Got dependencies!\n'
'a\n'
''
Which: is different.
Expected: a\n
Actual: Resolving ...
^
Differ at offset 0
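The log above shows the root of the flakiness: `pub` sometimes writes "Resolving dependencies..." / "Got dependencies!" to the captured output before the command's real output, so an exact string match fails intermittently. One way to make such assertions robust is to strip that chatter before comparing. A minimal sketch (a hypothetical helper, not part of melos; the noise lines are taken from the log above):

```dart
// Hypothetical helper: removes pub's dependency-resolution chatter from
// captured stdout so test assertions only see the command's real output.
// The filtered lines are assumptions based on the failure log above.
String stripPubNoise(String output) {
  const noise = {'Resolving dependencies...', 'Got dependencies!'};
  return output
      .split('\n')
      .where((line) => !noise.contains(line.trim()))
      .join('\n');
}

void main() {
  // Mirrors the "Actual" output from the failing run.
  const actual = 'Resolving dependencies...\nGot dependencies!\na\n';
  assert(stripPubNoise(actual) == 'a\n');
  print(stripPubNoise(actual));
}
```

A test would then compare `stripPubNoise(actualOutput)` against the expected string, so the assertion passes whether or not `pub get` happened to run first.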
Is there an existing issue for this?
Version
5.2.1
Description
Some tests are flaky when they run for Linux in the CI, example run: https://github.com/invertase/melos/actions/runs/8373797082/job/22927655150
Steps to reproduce
Run the CI; roughly 50% of the time some Linux tests will fail.
Expected behavior
Non-flaky tests.
Screenshots
No response
Additional context and comments
No response