SciML / LinearSolve.jl

LinearSolve.jl: High-Performance Unified Interface for Linear Solvers in Julia. Easily switch between factorization and Krylov methods and add preconditioners, all in one interface.
https://docs.sciml.ai/LinearSolve/stable/

Use `Preferences` to toggle precompilation of code that requires GPL dependencies #238

Closed · DilumAluthge closed this 1 year ago

DilumAluthge commented 1 year ago

Fixes #224

We use Preferences. The user can toggle the behavior by specifying the include_sparse preference. If the user does not specify the preference, we default to the value of Base.USE_GPL_LIBS.

codecov[bot] commented 1 year ago

Codecov Report

Merging #238 (fb6fd69) into main (0857619) will decrease coverage by 0.18%. The diff coverage is 62.50%.

@@            Coverage Diff             @@
##             main     #238      +/-   ##
==========================================
- Coverage   65.04%   64.86%   -0.19%     
==========================================
  Files          11       12       +1     
  Lines         701      703       +2     
==========================================
  Hits          456      456              
- Misses        245      247       +2     
| Impacted Files | Coverage Δ |
| --- | --- |
| src/LinearSolve.jl | 75.00% <ø> (ø) |
| src/factorization.jl | 78.43% <ø> (-0.21%) ↓ |
| src/default.jl | 46.47% <50.00%> (-1.35%) ↓ |
| src/factorization_sparse.jl | 100.00% <100.00%> (ø) |


andreasnoack commented 1 year ago

Ref https://github.com/SciML/LinearSolve.jl/issues/224

ChrisRackauckas commented 1 year ago

What about `KLUFactorization` and `UMFPACKFactorization`? Won't those need to be avoided? Then won't the default algorithm choice need to pick some other algorithm for sparse matrices? What algorithm would it choose? Does Julia have a fallback here?

DilumAluthge commented 1 year ago

In the case where Julia was built with USE_GPL_LIBS=0, I think it's fine for us to say that this package doesn't support any functionality on sparse arrays.

rayegun commented 1 year ago

Nothing in SparseArrays for sure.

Only the factorizations are GPL'd, though?

ChrisRackauckas commented 1 year ago

So then what do we do with the default algorithm choice?

DilumAluthge commented 1 year ago

We could check whether `LinearSolve.INCLUDE_SPARSE` is false, and if so, have `defaultalg(A, ...)` throw an error when `A` is a sparse matrix, saying that sparse matrices are not supported?

ChrisRackauckas commented 1 year ago

But then how do we make that dispatch on A?

DilumAluthge commented 1 year ago

The `SparseMatrixCSC` type still exists in Julia even if `Base.USE_GPL_LIBS` is false. So we can still dispatch on it.
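
A hypothetical sketch of what that dispatch, combined with the error proposed a few comments up, could look like. The real entry point is `defaultalg`; the name `defaultalg_sketch`, the simplified signature, and the local `INCLUDE_SPARSE` constant are stand-ins, not LinearSolve's actual internals.

```julia
using SparseArrays

# Stand-in for the Preferences-backed constant discussed in this PR.
const INCLUDE_SPARSE = Base.USE_GPL_LIBS

# SparseMatrixCSC exists even on non-GPL builds of Julia, so we can dispatch on
# it and refuse sparse input when the GPL-only factorizations were compiled out.
function defaultalg_sketch(A::SparseMatrixCSC, b)
    if !INCLUDE_SPARSE
        error("LinearSolve was precompiled with include_sparse = false; ",
              "sparse matrices are not supported by the default algorithm.")
    end
    # ... on GPL builds, choose KLUFactorization / UMFPACKFactorization here ...
end
```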

ChrisRackauckas commented 1 year ago

And does a generic fallback implementation exist for `\`?

ChrisRackauckas commented 1 year ago

@Wimmerer weren't you going to do a pure Julia KLU sometime?

ChrisRackauckas commented 1 year ago

I guess we can default it to `KrylovJL_GMRES`.

j-fu commented 1 year ago

> And does a generic fallback implementation exist for `\`?

https://github.com/PetrKryslUCSD/Sparspak.jl

DilumAluthge commented 1 year ago

> And does a generic fallback implementation exist for `\`?

Yeah:

julia> Base.USE_GPL_LIBS
false

julia> using SuiteSparse, SparseArrays, Random, LinearAlgebra

julia> A = sprand(4, 4, 0.3) + I
4×4 SparseMatrixCSC{Float64, Int64} with 6 stored entries:
 1.24017    ⋅    ⋅         ⋅
  ⋅        1.0   ⋅         ⋅
 0.984142   ⋅   1.0        ⋅
  ⋅         ⋅   0.824643  1.0

julia> b = rand(4)
4-element Vector{Float64}:
 0.9655062442597853
 0.15175107617153183
 0.004248022435501819
 0.4659030670094594

julia> typeof(A)
SparseMatrixCSC{Float64, Int64}

julia> typeof(b)
Vector{Float64} (alias for Array{Float64, 1})

julia> A\b
4-element Vector{Float64}:
  0.7785278622862858
  0.15175107617153183
 -0.7619340985421428
  1.0942264516593845

rayegun commented 1 year ago

It still has to be LGPL, but yeah I'll get it out the door this week @ChrisRackauckas. It just defaults to the dense case as far as I can tell. Once glue deps arrive I can add stuff like Sparspak to LinearSolve? Or I can just go ahead and do that as a direct dep.

DilumAluthge commented 1 year ago

> I guess we can default it to `KrylovJL_GMRES`.

Alright, I just pushed b576e4032ef147db454a21123c875e8dfc8537e6: if `INCLUDE_SPARSE` is false and `A` is a `SparseMatrixCSC`, we default to `KrylovJL_GMRES()`.
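
A minimal sketch of that behavior (illustrative only: the real `defaultalg` takes more arguments and handles many more cases, and `LinearSolve.INCLUDE_SPARSE` is the constant added in this PR, so it may not exist in other versions, hence the `isdefined` guard):

```julia
using SparseArrays, LinearSolve

# Fall back to Base.USE_GPL_LIBS if this LinearSolve version lacks the constant.
const HAS_GPL_SPARSE = isdefined(LinearSolve, :INCLUDE_SPARSE) ?
                       LinearSolve.INCLUDE_SPARSE : Base.USE_GPL_LIBS

function sparse_default_alg(A::SparseMatrixCSC)
    if HAS_GPL_SPARSE
        # GPL build: keep a sparse factorization (UMFPACK; KLU is the other candidate).
        return UMFPACKFactorization()
    else
        # GPL-free build: fall back to the iterative GMRES wrapper.
        return KrylovJL_GMRES()
    end
end
```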

ChrisRackauckas commented 1 year ago

> Once glue deps arrive I can add stuff like Sparspak to LinearSolve? Or I can just go ahead and do that as a direct dep.

I think it's fine to add it as a direct dep

j-fu commented 1 year ago

> > Once glue deps arrive I can add stuff like Sparspak to LinearSolve? Or I can just go ahead and do that as a direct dep.
>
> I think it's fine to add it as a direct dep

@PetrKryslUCSD