boostorg / multiprecision

Boost.Multiprecision

Query tiny CI detail float128 specfun tests maybe not running #520

Closed: ckormanyos closed this issue 1 year ago

ckormanyos commented 1 year ago

Hi John (@jzmaddock) and Matt (@mborland),

I believe I may have stumbled across an issue worth querying on the main issue list.

While attempting to run specfun tests for a new backend, I found that both my intended specfun tests and those for float128 might not actually be running in CI.

Consider the following build log. Here we see that the specfun tests for float128 go green, but a detailed look at the log reveals that no tests appear to have been compiled or executed at all. In fact, only 1 target was found in the test phase of that CI run.

I really am a beginner at integrating specfun, but I tend to think that these tests are actually not running.

If this is the case, I suspect the requirement on the build check has_float128 (or whatever it is called) evaluates to false, so no tests run in this particular configuration.

Ummmm.... What do you guys think? And if this suspicion is correct, maybe I can then get my new backend's specfun tests running?

Cc: @sinandredemption

ckormanyos commented 1 year ago

So just speculating...

But in other projects, I've handled such false positives by doing crazy stuff like:

  • Pipe the build log to a temporary file.
  • cat the temporary file for build output.
  • tail and then grep or gawk the temporary build output file to get a unanimous decision on test success.

That's a long way to go (a rough sketch follows below). The other option is to check it diligently and get the build dependencies absolutely right in the crazy VM/compiler combinations confronting the CI.
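
Sketch of the log-grepping idea, just to show what I mean (this assumes the CI step drives b2 directly; the test target name is only a placeholder, and keying on b2's "...updated N targets..." summary line is a heuristic rather than a guarantee):

```sh
# Pipe the build output to a temporary file while still showing it live.
b2 toolset=gcc cxxstd=20 cxxstd-dialect=gnu test_float128_specfun 2>&1 | tee /tmp/specfun_build.log

# Show the tail of the log, where b2 prints its "...found/...updated targets" summary.
tail -n 30 /tmp/specfun_build.log

# Heuristic: if b2 never reports having updated (i.e. built and run) any target,
# treat the green result as a false positive and fail the job explicitly.
if ! grep -q 'updated.*target' /tmp/specfun_build.log; then
  echo "No test targets were built or run"
  exit 1
fi
```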

jzmaddock commented 1 year ago

This is intended: you're running the tests with c++20 rather than gnu++20, and GNU extensions are required for __float128.
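
For illustration only, the difference amounts to the dialect handed to b2; something along these lines, where the specfun target name is a placeholder rather than the exact CI invocation:

```sh
cd libs/multiprecision/test

# Strict ISO mode: g++ is invoked with -std=c++20, the has_float128 configure
# check does not pass, and the float128 specfun tests are silently skipped.
../../../b2 toolset=gcc cxxstd=20 specfun

# GNU dialect: g++ is invoked with -std=gnu++20, GNU extensions (and hence
# __float128) are available, and the float128 specfun tests actually build and run.
../../../b2 toolset=gcc cxxstd=20 cxxstd-dialect=gnu specfun
```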

ckormanyos commented 1 year ago

This is intended: you're running the tests with c++20 rather than gnu++20, and GNU extensions are required for __float128.

Thanks John, I was afraid you'd say that. This brings up some follow-up questions.

Would it make sense to make a test-matrix group for gnu++ standards on GHA? I know these exist on some of the other CI services, but I couldn't find one on GHA.

ckormanyos commented 1 year ago

I guess it'd be possible to add, say, gnu++14, etc. to the test matrix for those specfuns on GHA? Or is this just silly?
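
As a sketch of what each matrix leg might boil down to (the STANDARD value would come from the GHA matrix; the variable names and the specfun target are placeholders I'm making up here):

```sh
# Hypothetical helper: translate a matrix entry such as "gnu++14" into b2 arguments.
STANDARD="gnu++14"   # in practice this would be supplied by the GHA matrix
case "${STANDARD}" in
  gnu++*) B2_ARGS="cxxstd=${STANDARD#gnu++} cxxstd-dialect=gnu" ;;
  c++*)   B2_ARGS="cxxstd=${STANDARD#c++}" ;;
esac

cd libs/multiprecision/test
../../../b2 toolset=gcc ${B2_ARGS} specfun
```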

jzmaddock commented 1 year ago

The drone tests are all gnu++XX. Just the way it got split.

Thinking out loud here, we could cat the configuration build log to the end of the output if that would help in future? It's in bin.v2/config.log BTW.
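
That would probably just mean appending something like this to the CI test step (the path is the one mentioned above; whether the file exists can depend on the build layout, hence the fallback):

```sh
# Dump the configure-check results (e.g. whether has_float128 passed) into the CI log.
cat bin.v2/config.log || true
```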

ckormanyos commented 1 year ago

Thinking out loud here, we could cat the configuration build log to the end of the output if that would help in future? It's in bin.v2/config.log BTW.

This is the kind of thing that could help. Or it could turn out to be a stumbling block if it proves to be unreliable.

If we can get these kinds of added checks to be reliable, then this is, in fact, the kind of thing I usually do.

Since I am in that area anyway, I will take a look at the feasibility, the cost/benefit, and how reliable such a check is likely to be.

I usually use a combination of cat, tail, and grep or gawk on *nix. I'll take a look.

I'll also close this issue.

Thanks John! Good advice as always!