dmehra opened 8 years ago
Maybe the build matrix in travis.yml could be used to achieve this - the 'columns' being node versions, and the 'rows' juttle master and juttle x.y.z.
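A sketch of what such a matrix could look like (the node versions and the `JUTTLE_DEP` env-var convention are illustrative assumptions, not something from this thread):

```yaml
# .travis.yml (sketch)
language: node_js
node_js:                           # 'columns': node versions
  - "4"
  - "5"
env:                               # 'rows': which juttle to test against
  - JUTTLE_DEP=juttle              # latest published release
  - JUTTLE_DEP=juttle/juttle      # GitHub master
install:
  - npm install
  - npm install "$JUTTLE_DEP"     # override the juttle version from package.json
script:
  - npm test
```

Travis expands this into one job per (node_js, env) combination, so each adapter gets tested against both juttle versions on every node version.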
Ideally, the tests against master should serve as a 'red flag' to the developer that the incompatibilities w/ master should be fixed, but not necessarily in this particular PR.
From this perspective, I think that the build should 'soft fail':
What does "soft fail" mean to you? If Travis reports green, your incentive to read its report is very low (as it should be, not to waste dev time).
I see the path of reporting red if the test against master failed, but allowing the merge to proceed with an explanation note on the PR.
The alternative is to not test against master in Travis but leave it to the developer (locally, having changed the version dependency).
@demmer thoughts?
@dmehra IIUC, travis can distinguish between 'failing a build' and 'failing a single test suite in the matrix'. The tests would still be red but the build would pass - which I call 'soft fail' (I assume that the badge would be red in that case - which needs verification). The feature is described here
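For reference, the `allow_failures` setting goes under the `matrix:` key and matches jobs by their configuration. A minimal sketch, assuming the master jobs are identified by an env var as above:

```yaml
matrix:
  allow_failures:
    # Jobs testing against juttle master may go red
    # without failing the build as a whole.
    - env: JUTTLE_DEP=juttle/juttle
```

The match is on the exact key/value pair of the job's config, so the `env` string here has to match the matrix entry verbatim.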
Interesting, I didn't know about this feature. This post has a screenshot of what it will look like in Travis. Keep in mind, however, that the github badge on the PR will be green, and it's only if you click through the details to the Travis page that you'll see the allowed failures. So it still leaves the developer in the somewhat odd situation of needing to triage their green badge every time to see if it's "all green". Don't know if travis would email you about the failures in this case.
@rlgomes have you tried using the allow_failures travis feature?
nope, but I have yet to find in all my career a situation where "green that is really red" or "yellow instead of red" leads to anyone actually caring about it... so let's just live with RED if it happens to be red and do something about it at that point.
@rlgomes On the other hand, perpetual redness creates blindness and “real” redness can end up being ignored because of that.
If failed tests in allow_failures send failure e-mails like normal tests do, I'd say that would be enough to alert the owner (which is the point of the whole exercise).
Let's let red be red and green be green, and when there's a problem with actual tests failing for reasons other than there being something to fix, we can look closely at why this is happening and change the way we're testing rather than the meaning of passing vs. failing.
(This is a juttle-engine issue because it's cross-repo)
Set up a way in travis to run unit tests of each adapter against the latest release and latest master of juttle. The goal is to notice when adapter and juttle have diverged.
(npm cache clean? at the expense of slowing down travis)
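If stale cached packages are the concern, the cache could be cleared in the install step; a sketch, assuming the git-dependency convention from earlier (whether this is needed at all depends on how the master dependency is fetched):

```yaml
install:
  - npm cache clean              # avoid picking up a stale juttle tarball
  - npm install
  - npm install juttle/juttle    # fresh checkout of juttle master from GitHub
```

Note that npm fetches git dependencies like `juttle/juttle` from GitHub rather than from the registry cache, so the `npm cache clean` mainly guards the registry tarballs; it will slow down every build, as noted above.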