JuliaSmoothOptimizers / MUMPS.jl

A Julia Interface to MUMPS

Test MUMPS_seq_jll 5.5.1 with MKL #108

Closed amontoison closed 9 months ago

codecov[bot] commented 2 years ago

Codecov Report

Base: 19.94% // Head: 20.83% // Increases project coverage by +0.89% :tada:

Coverage data is based on head (2c7e873) compared to base (cce74aa). Patch coverage: 90.00% of modified lines in pull request are covered.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #108      +/-   ##
==========================================
+ Coverage   19.94%   20.83%   +0.89%
==========================================
  Files           6        7       +1
  Lines         777      787      +10
==========================================
+ Hits          155      164       +9
- Misses        622      623       +1
```

| [Impacted Files](https://codecov.io/gh/JuliaSmoothOptimizers/MUMPS.jl/pull/108?src=pr&el=tree) | Coverage Δ |
|---|---|
| [src/MUMPS.jl](https://codecov.io/gh/JuliaSmoothOptimizers/MUMPS.jl/pull/108/diff?src=pr&el=tree#diff-c3JjL01VTVBTLmps) | `90.00% <90.00%> (ø)` |

:umbrella: View full report at Codecov.

amontoison commented 2 years ago

We also get the null pointer on Windows if we use MKL_jll instead of OpenBLAS32_jll. The tests also fail on macOS if we use MKL_jll. :(

dpo commented 2 years ago

This failure points to a BLAS mixup. It's the same failure we were having before you fixed the JLL.

amontoison commented 2 years ago

> This failure points to a BLAS mixup. It's the same failure we were having before you fixed the JLL.

MKL only works if it's single threaded on Mac... https://github.com/JuliaLinearAlgebra/MKL.jl/issues/111
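A minimal sketch of that single-threaded workaround, assuming MKL.jl is installed in the active environment (it is not a stdlib):

```julia
# Hedged sketch: pin MKL to one BLAS thread to sidestep the
# multi-threaded failure reported on Mac in the linked issue.
using LinearAlgebra
using MKL  # assumption: MKL.jl is available; loading it forwards BLAS/LAPACK to MKL via libblastrampoline

# Restrict the BLAS backend to a single thread.
LinearAlgebra.BLAS.set_num_threads(1)
@show LinearAlgebra.BLAS.get_num_threads()
```

No `<test>` is practical here since the behavior is platform- and package-dependent.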

It would be more appropriate to use Apple Accelerate on Apple platforms.

```julia
LinearAlgebra.BLAS.lbt_forward("/System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate")
```

I just don't know whether it's an LP64 or ILP64 BLAS.
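One way to answer that question is to inspect what libblastrampoline has loaded after the forward. A sketch (macOS-only, since it targets the standard Accelerate framework path):

```julia
# Hedged sketch: forward BLAS/LAPACK calls to Apple Accelerate via
# libblastrampoline, then report which integer interface each loaded
# library exposes (:lp64 = 32-bit ints, :ilp64 = 64-bit ints).
using LinearAlgebra

LinearAlgebra.BLAS.lbt_forward(
    "/System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate")

# get_config() returns the LBT configuration, including every library
# currently forwarded to and its interface.
for lib in LinearAlgebra.BLAS.get_config().loaded_libs
    println(lib.libname, " => ", lib.interface)
end
```

The snippet is macOS-specific, so no portable `<test>` applies.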

dpo commented 2 years ago

I don't think it uses 64-bit ints.

amontoison commented 9 months ago

The issue was fixed in libblastrampoline (LBT) 5.4.0, which ships with Julia >= 1.9.