SciML / LinearSolve.jl

LinearSolve.jl: High-Performance Unified Interface for Linear Solvers in Julia. Easily switch between factorization and Krylov methods and add preconditioners, all in one interface.
https://docs.sciml.ai/LinearSolve/stable/

start fixing ambiguities #437

Closed: ArnoStrouwen closed this 8 months ago

ArnoStrouwen commented 9 months ago

Fixes many of the method ambiguities in this package. Things I'm still unsure about:

First, for `defaultalg` there is a dispatch on `AbstractSciMLOperator` as well as a dispatch on `OperatorAssumptions{Nothing}`. The `AbstractSciMLOperator` method dispatches based on `A.A`, while the `OperatorAssumptions{Nothing}` method dispatches based on `issquare(A)`. When both occur together, I don't know which order to apply them in. Currently, the `Nothing` is resolved first, but this means `issquare` has to be defined for operators.
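
To make the clash concrete, here is a minimal toy reproduction of that dispatch pattern (hypothetical names such as `WrappedOp` and `Assumptions`, not LinearSolve's actual definitions):

```julia
# Toy sketch of the ambiguity: one method is more specific in the operator
# argument, the other in the assumptions argument.
abstract type AbstractOp end            # stand-in for AbstractSciMLOperator
struct WrappedOp{T} <: AbstractOp
    A::T                                # wrapped matrix, analogous to A.A
end
struct Assumptions{issq} end            # stand-in for OperatorAssumptions

issquare(A::AbstractMatrix) = size(A, 1) == size(A, 2)
issquare(op::WrappedOp) = issquare(op.A)   # needed if the Nothing is resolved first

# Concrete base case: assumptions known, plain matrix.
defaultalg(A::AbstractMatrix, b, ::Assumptions{issq}) where {issq} = issq ? :lu : :qr

# (1) Dispatch on the operator type: unwrap A.A, keep the assumptions.
defaultalg(A::AbstractOp, b, assump::Assumptions) = defaultalg(A.A, b, assump)

# (2) Dispatch on "no assumption supplied": fill it in via issquare.
defaultalg(A, b, ::Assumptions{Nothing}) = defaultalg(A, b, Assumptions{issquare(A)}())

# (1) and (2) are ambiguous for defaultalg(op, b, Assumptions{Nothing}()).
# Resolving the Nothing first, as this PR currently does, means defining the
# intersection method, and therefore requires issquare for operators:
defaultalg(A::AbstractOp, b, ::Assumptions{Nothing}) =
    defaultalg(A, b, Assumptions{issquare(A)}())

defaultalg(WrappedOp(rand(3, 3)), rand(3), Assumptions{Nothing}())  # :lu
```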

Second, there are remaining ambiguities regarding:

```julia
function init_cacheval(alg::GenericFactorization,
    A::Union{Hermitian{T, <:SparseMatrixCSC},
        Symmetric{T, <:SparseMatrixCSC}}, b, u, Pl, Pr,
    maxiters::Int, abstol, reltol, verbose::Bool,
    assumptions::OperatorAssumptions) where {T}
    newA = copy(convert(AbstractMatrix, A))
    do_factorization(alg, newA, b, u)
end
```

e.g. with:

```julia
function init_cacheval(alg::GenericFactorization{typeof(svd!)}, A::AbstractMatrix, b, u, Pl, Pr,
    maxiters::Int,
    abstol, reltol, verbose::Bool, assumptions::OperatorAssumptions)
    ArrayInterface.svd_instance(A)
end
```

I'm not sure how to resolve these; I'm assuming we want as few calls to `do_factorization` as possible?
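
One mechanical way to resolve such a pair is to add a method at the exact intersection of the two signatures; which body that method should have is precisely the open question. A sketch, assuming the answer is yes and the `svd!` specialization should take precedence over the generic `do_factorization` path:

```julia
# Intersection method to silence the ambiguity, assuming the svd! path should
# win for wrapped sparse matrices (and that ArrayInterface.svd_instance
# handles the Hermitian/Symmetric sparse wrappers):
function init_cacheval(alg::GenericFactorization{typeof(svd!)},
    A::Union{Hermitian{T, <:SparseMatrixCSC},
        Symmetric{T, <:SparseMatrixCSC}}, b, u, Pl, Pr,
    maxiters::Int, abstol, reltol, verbose::Bool,
    assumptions::OperatorAssumptions) where {T}
    ArrayInterface.svd_instance(A)
end
```

Keeping the `do_factorization` body instead would be equally valid mechanically; the ambiguity only forces a decision, not a particular answer.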

Third, for some special matrix types, such as `Diagonal`, there is a dispatch that always uses the same method, no matter the algorithm. But there are also fallback methods so that most algorithms work for a general `A`. This design choice leads to tons of additional method definitions just to resolve ambiguities between the two.
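
For reference, the remaining pairs are easy to enumerate while working through them; this uses only the Test standard library and is not specific to this PR:

```julia
using Test, LinearSolve

# List every pair of ambiguous methods defined by the package.
ambiguities = detect_ambiguities(LinearSolve)
for (m1, m2) in ambiguities
    println(m1, "\n", m2, "\n")
end

# Once the PR is complete, this is the invariant the test suite can enforce:
@test isempty(detect_ambiguities(LinearSolve))
```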

codecov[bot] commented 9 months ago

Codecov Report

Attention: 47 lines in your changes are missing coverage. Please review.

Comparison is base (f87583e) 66.50% compared to head (3099e72) 65.44%.

| Files | Patch % | Lines |
| --- | --- | --- |
| src/factorization.jl | 12.24% | 43 Missing :warning: |
| src/default.jl | 63.63% | 4 Missing :warning: |
Additional details and impacted files

```diff
@@           Coverage Diff            @@
##             main     #437    +/-   ##
========================================
- Coverage   66.50%   65.44%   -1.06%
========================================
  Files          27       27
  Lines        2039     2072      +33
========================================
  Hits         1356     1356
- Misses        683      716      +33
```

:umbrella: View full report in Codecov by Sentry.

ChrisRackauckas commented 8 months ago

Resolve the conflicts?

ChrisRackauckas commented 8 months ago

@avik-pal can you review this one?

avik-pal commented 8 months ago

When can `issquare` be `nothing`? If we can't resolve the dimensions, can't we assume rectangular?

avik-pal commented 8 months ago

Oh, I see, it is there to mark the case where the user provided no assumption. This PR seems fine in that case.