Closed DilumAluthge closed 1 year ago
Merging #238 (fb6fd69) into main (0857619) will decrease coverage by 0.18%. The diff coverage is 62.50%.
@@            Coverage Diff             @@
##             main     #238      +/-   ##
==========================================
- Coverage   65.04%   64.86%    -0.19%
==========================================
  Files          11       12        +1
  Lines         701      703        +2
==========================================
  Hits          456      456
- Misses        245      247        +2
Impacted Files | Coverage Δ | |
---|---|---|
src/LinearSolve.jl | 75.00% <ø> (ø) | |
src/factorization.jl | 78.43% <ø> (-0.21%) | :arrow_down: |
src/default.jl | 46.47% <50.00%> (-1.35%) | :arrow_down: |
src/factorization_sparse.jl | 100.00% <100.00%> (ø) | |
What about KLUFactorization and UMFPACKFactorization? Won't those need to be avoided? Then won't the default algorithm choice need to pick some other algorithm for sparse? What algorithm would it choose? Does Julia have a fallback here?
In the case where Julia was built with USE_GPL_LIBS=0, I think it's fine for us to say that this package doesn't support any functionality on sparse arrays.
Nothing in SparseArrays for sure.
Only factorizations are GPLd though?
So then what do we do with the default algorithm choice?
We could check if LinearSolve.INCLUDE_SPARSE is false, and if it is and A is a sparse matrix, have defaultalg(A, ...) throw an error saying that sparse matrices are not supported?
But then how do we make that dispatch on A?
The SparseMatrixCSC type still exists in Julia even if Base.USE_GPL_LIBS is false, so we can still dispatch on it.
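A minimal sketch of that dispatch, assuming a hypothetical INCLUDE_SPARSE flag and defaultalg function standing in for the package's internals (not the actual LinearSolve source):

```julia
using SparseArrays

# Hypothetical stand-in for LinearSolve.INCLUDE_SPARSE, fixed at precompile time.
const INCLUDE_SPARSE = false

# Illustrative generic fallback for dense matrices.
defaultalg(A::AbstractMatrix, b) = :LUFactorization

# SparseMatrixCSC is always defined, so this method can exist even when
# Base.USE_GPL_LIBS is false.
function defaultalg(A::SparseMatrixCSC, b)
    INCLUDE_SPARSE ||
        error("sparse matrices are not supported when Julia is built with USE_GPL_LIBS=0")
    return :UMFPACKFactorization
end
```

Calling defaultalg with a dense matrix hits the generic method, while a SparseMatrixCSC hits the guarded method and errors when the flag is off.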
And does a generic fallback implementation exist for \?
@Wimmerer weren't you going to do a pure Julia KLU sometime?
I guess we can default it to KrylovJL_GMRES
> And does a generic fallback implementation exist for \?
Yeah:
julia> Base.USE_GPL_LIBS
false
julia> using SuiteSparse, SparseArrays, Random, LinearAlgebra
julia> A = sprand(4, 4, 0.3) + I
4×4 SparseMatrixCSC{Float64, Int64} with 6 stored entries:
1.24017 ⋅ ⋅ ⋅
⋅ 1.0 ⋅ ⋅
0.984142 ⋅ 1.0 ⋅
⋅ ⋅ 0.824643 1.0
julia> b = rand(4)
4-element Vector{Float64}:
0.9655062442597853
0.15175107617153183
0.004248022435501819
0.4659030670094594
julia> typeof(A)
SparseMatrixCSC{Float64, Int64}
julia> typeof(b)
Vector{Float64} (alias for Array{Float64, 1})
julia> A\b
4-element Vector{Float64}:
0.7785278622862858
0.15175107617153183
-0.7619340985421428
1.0942264516593845
It still has to be LGPL, but yeah I'll get it out the door this week @ChrisRackauckas. It just defaults to the dense case as far as I can tell. Once glue deps arrive I can add stuff like Sparspak to LinearSolve? Or I can just go ahead and do that as a direct dep.
> I guess we can default it to KrylovJL_GMRES
Alright, I just pushed b576e4032ef147db454a21123c875e8dfc8537e6: if INCLUDE_SPARSE is false and A is a SparseMatrixCSC, we default to KrylovJL_GMRES().
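In sketch form the new default looks roughly like the following, where KrylovJL_GMRES and UMFPACKFactorization are placeholder structs rather than the real LinearSolve types:

```julia
using SparseArrays

const INCLUDE_SPARSE = false   # stands in for LinearSolve.INCLUDE_SPARSE

struct KrylovJL_GMRES end      # placeholder for LinearSolve's Krylov wrapper
struct UMFPACKFactorization end

function defaultalg(A::SparseMatrixCSC, b)
    # Without the GPL'd SuiteSparse factorizations, fall back to a
    # matrix-free Krylov method, which only requires matrix-vector products.
    INCLUDE_SPARSE || return KrylovJL_GMRES()
    return UMFPACKFactorization()
end
```

The appeal of GMRES here is that it never factorizes A, so it works for any sparse matrix type regardless of how Julia was built.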
> Once glue deps arrive I can add stuff like Sparspak to LinearSolve? Or I can just go ahead and do that as a direct dep.

I think it's fine to add it as a direct dep
@PetrKryslUCSD
Fixes #224
We use Preferences. The user can toggle the behavior by specifying the include_sparse preference. If the user does not specify the preference, we default to the value of Base.USE_GPL_LIBS.
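With Preferences.jl that pattern looks roughly like the following; this is a sketch of the mechanism, not the exact source, and assumes the constant is named INCLUDE_SPARSE as in the discussion above:

```julia
using Preferences

# Read the "include_sparse" preference for this package at precompile time,
# defaulting to Base.USE_GPL_LIBS when the user has not set it.
const INCLUDE_SPARSE = @load_preference("include_sparse", Base.USE_GPL_LIBS)
```

A user could then opt out with something like set_preferences!(LinearSolve, "include_sparse" => false) and restart Julia, since preferences loaded this way are baked in at precompile time.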