So just speculating...

But in other projects, I've handled such false positives by doing crazy stuff like:

- Pipe the build log to a temporary file.
- `cat` the temporary file for build output.
- `tail` and then `grep` or `gawk` the temporary build output file to get a unanimous decision on test success.

That's a long way to go. The other option: check it diligently and get the build dependencies absolutely right in the crazy VM/compiler combinations confronting the CI.
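Concretely, the kind of guard I mean would look roughly like this minimal sketch (the `b2` command line and the failure markers here are just assumptions for illustration, not this repository's actual setup):

```sh
# Pipe the build log to a temporary file (command line assumed).
b2 toolset=gcc cxxstd=20 > /tmp/build.log 2>&1

# cat the temporary file so the full log shows up in the CI output.
cat /tmp/build.log

# tail and then grep the log: treat any failure marker near the end
# as a failed run ("unanimous decision on test success").
if tail -n 50 /tmp/build.log | grep -E -q 'failed|error'; then
    echo "build/tests failed" >&2
    exit 1
fi
```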
This is intended: you're running the tests with `c++20` rather than `gnu++20`, and GNU extensions are required for `__float128`.
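For instance, a quick probe (assuming g++; this only shows the dialect difference, not Boost's exact detection logic):

```sh
# -std=c++20 puts g++ into strict-conformance mode and defines __STRICT_ANSI__;
# -std=gnu++20 does not, which is what keeps GNU extensions such as
# __float128 usable.
g++ -std=c++20   -dM -E -x c++ /dev/null | grep __STRICT_ANSI__   # prints the define
g++ -std=gnu++20 -dM -E -x c++ /dev/null | grep __STRICT_ANSI__   # prints nothing (exit status 1)
```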
Thanks John, I was afraid you'd say that. This brings up some follow-up questions.

- Why run the tests with `-std=c++XX` if nothing is actually executed?
- Which runs use `-std=gnu++XX`? So I can add my tests there.
- Would it make sense to make a test-matrix group for `gnu++` standards on GHA? I know these exist on some of the other CI services, but I couldn't find one on GHA.

I guess it'd be possible to add, say, `gnu++14`, etc. to the test matrix for those `specfun`s on GHA? Or is this just silly?
The drone tests are all `gnu++XX`. Just the way it got split.
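For reference, b2 can select the GNU dialect directly, so a GHA group along these lines would be possible; a sketch (the `specfun` target name and the paths are assumptions):

```sh
# cxxstd-dialect=gnu turns cxxstd=14 into -std=gnu++14 (and so on),
# keeping GNU extensions such as __float128 enabled.
cd libs/multiprecision/test
../../../b2 toolset=gcc cxxstd=14,17,20 cxxstd-dialect=gnu specfun
```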
Thinking out loud here, we could `cat` the configuration build log to the end of the output if that would help in future? It's in `bin.v2/config.log` BTW.
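Something like this, as a sketch (path assumed relative to the build directory):

```sh
# After the b2 run, unconditionally dump the configuration log so that
# silently-disabled features (e.g. a failed __float128 check) are visible.
echo "==== bin.v2/config.log ===="
cat bin.v2/config.log || echo "no config.log produced"
```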
> Thinking out loud here, we could `cat` the configuration build log to the end of the output if that would help in future? It's in `bin.v2/config.log` BTW.
This is the kind of thing that could help. Or it could turn out to be a stumbling block if it proves to be unreliable.
If we can get these kinds of added checks to be reliable, I usually do, in fact, do this kind of thing.
Since I am in that area, I will take a look at feasibility/cost-benefit/reliability-prediction.
I usually use a combination of `cat`, `tail`, and `grep` or `gawk` on *nix. I'll take a look.
I'll also close this issue.
Thanks John! Good advice as always!
Hi John (@jzmaddock) and Matt (@mborland),
I believe I may have stumbled across an issue worth querying on the main issue list.
While attempting to run `specfun` tests for a new backend, I found that both my intended `specfun` tests as well as those for `float128` seem like they might not actually be running in CI.

Consider the following build log. Here we see that the `specfun` tests for `float128` go green, but a detailed look at the log actually reveals that no tests whatsoever seem to have been compiled and executed. In fact, we only `found 1 target` in the `test` phase of that CI run.

I really am a beginner in integrating `specfun`, but I tend to think that these tests are actually not running.

If this is the case, I feel maybe the requirement on the build check `has_float128` (or it's called something like that) is `false`, so no tests run in this particular detailed constellation.

Ummmm.... What do you guys think? And if this suspicion is actually happening, maybe I can get my new backend `specfun` running?

Cc: @sinandredemption
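A possible guard for exactly this symptom, as a hedged sketch (the marker string comes from the log mentioned above; the log file name is an assumption):

```sh
# Guard against the green-but-empty run described above: if the test phase
# reports only a single target, nothing was really compiled or executed.
if grep -q 'found 1 target' /tmp/build.log; then
    echo "ERROR: test phase found only 1 target; tests did not run" >&2
    exit 1
fi
```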