JuliaSparse / KLU.jl

Julia Wrapper for the KLU sparse matrix solver from SuiteSparse
https://klu.juliasparse.org/
MIT License

Support view of 2D slices #11

Open · mzy2240 opened this issue 1 year ago

mzy2240 commented 1 year ago
using SparseArrays, KLU

A = sprand(10, 10, 0.5)
b = rand(10)
copy_b = copy(b)
ft = klu(A[1:8, 1:8]);
KLU.solve!(ft, @view b[1:8]);  # a contiguous view of a vector: works

The above code works. It also works fine if b is a 2D matrix, or even if a whole view @view b[:, :] is passed. However, if @view b[indexes, indexes] is passed, the following error occurs: [screenshot of the error omitted]
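
For reference, a minimal sketch showing which of these views Julia itself treats as contiguous (Base.iscontiguous is an unexported Base helper; this assumes it behaves as in recent Julia versions, and the exact check KLU.jl performs may differ):

b = rand(10)
B = rand(10, 10)

Base.iscontiguous(@view b[1:8])       # true:  a unit range of a vector is contiguous
Base.iscontiguous(@view B[:, :])      # true:  the whole matrix is one contiguous block
Base.iscontiguous(@view B[:, 3])      # true:  a full column is contiguous
Base.iscontiguous(@view B[1:8, 1:8])  # false: rows from different columns are not adjacent in memory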

rayegun commented 1 year ago

So it has to be contiguous. I can support passing a column of a matrix: @view A[:, index]

I'll add that

mzy2240 commented 1 year ago

Would it be useful to add support for @view A[:, :]? Though I am not sure how much time it would save for a 100k × 100k sparse matrix.

rayegun commented 1 year ago

What do you mean by @view A[:, :]? We can't pass views of A, and we can only pass views of b that are column major and contiguous. The C library must receive exactly the arrays it expects. The KLU port will perhaps be slightly more flexible.
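
In the meantime, one workaround sketch (the non-contiguous index set rows here is hypothetical) is to copy the slice into a plain Vector, solve in place, and copy the result back:

using LinearAlgebra, SparseArrays, KLU

A = sprand(10, 10, 0.5) + I    # shift the diagonal so the example is nonsingular
b = rand(10)
rows = [2, 3, 5, 7, 8]         # hypothetical non-contiguous index set
ft = klu(A[rows, rows])
rhs = b[rows]                  # plain, contiguous copy the C library can accept
KLU.solve!(ft, rhs)            # solves in place: rhs now holds the solution
b[rows] .= rhs                 # write it back into the original array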

mzy2240 commented 1 year ago

Thanks for the explanation. That makes sense.

Amuri44 commented 1 year ago

I think my problem is similar to above. However, I could not get the point exactly.

Why is a matrix sliced with a view not supported by klu, as shown below?

julia> A = sprand(10, 10, 0.5);

julia> klu(@view(A[1:5, 1:5]))
ERROR: MethodError: no method matching klu(::SubArray{Float64, 2, SparseMatrixCSC{Float64, Int64}, Tuple{UnitRange{Int64}, UnitRange{Int64}}, false})
rayegun commented 1 year ago

This is not really similar; we were talking about views of dense matrices. In that case we can't use the view as is, because the dense storage is not contiguous.

The problem you're describing is slightly different: it's a view of a sparse matrix. But I'll just handle both by copying.
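
Until that lands, a minimal workaround sketch is to index the sparse matrix instead of viewing it, which copies the slice into a new SparseMatrixCSC that klu already accepts:

using LinearAlgebra, SparseArrays, KLU

A = sprand(10, 10, 0.5) + I   # shift the diagonal so the slice is nonsingular
F = klu(A[1:5, 1:5])          # A[1:5, 1:5] copies the slice into a SparseMatrixCSC
b5 = rand(5)
KLU.solve!(F, b5)             # b5 now holds the solution, as in the earlier examples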

Amuri44 commented 1 year ago

Thank you! FYI, I am linking Mishandling of lu on view of SparseMatrixCSC here as a reference, in case it helps in any way.

I have one concern, please. The solution of the linear system Ax = b with an LU factorization is x = U \ (L \ b), given LU = lu(A). I am wondering what the equivalent is in KLU, since there is also F, the upper triangular part of the off-diagonal blocks. In other words, I am stuck at x = (LU + F) \ b, since that looks heavier to compute than A \ b.
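
For what it's worth, here is a sketch of how a block upper triangular (BTF) system of that form is typically solved without ever forming LU + F explicitly; KLU's actual convention also involves row scaling and permutations, so treat this only as an illustration. With two diagonal blocks,

$$
\begin{pmatrix} A_{11} & F_{12} \\ 0 & A_{22} \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
=
\begin{pmatrix} b_1 \\ b_2 \end{pmatrix},
\qquad A_{11} = L_1 U_1, \quad A_{22} = L_2 U_2,
$$

block back-substitution gives

$$
x_2 = U_2^{-1} \bigl( L_2^{-1} b_2 \bigr), \qquad
x_1 = U_1^{-1} \bigl( L_1^{-1} ( b_1 - F_{12}\, x_2 ) \bigr),
$$

so the off-diagonal block F only ever appears in a sparse matrix-vector product, and the total cost stays comparable to an ordinary x = U \ (L \ b), not heavier than A \ b.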