JuliaLinearAlgebra / AlgebraicMultigrid.jl

Algebraic Multigrid in Julia

Use sparse LU as a faster coarse solver than Pinv #94

Closed: learning-chip closed this issue 11 months ago

learning-chip commented 2 years ago

The only available CoarseSolver is Pinv, which becomes very slow when the coarsest-level A is large, for example in a two-grid setting instead of a multigrid V-cycle. I wrote a demo that uses UMFPACK sparse LU to solve the coarse equation; it shows a significant speed-up in both the AMG setup phase and the solve phase. I can make a PR if that is useful.

Current performance with dense Pinv

import Random: seed!
using SparseArrays
using LinearAlgebra
using AlgebraicMultigrid
import AlgebraicMultigrid as AMG
using BenchmarkTools

# A Poisson matrix also makes the point, but its second-level A under Ruge-Stuben coarsening is rather dense,
# so the gap between sparse and dense solves is not significant enough.
# Here we use a very sparse random matrix that leads to a sparse-enough second level.
# A = poisson((64, 64))
seed!(0)
A = sprand(8000, 8000, 1e-4) * 0.1 + I  # add to the diagonal to avoid a singular matrix
b = A * ones(size(A, 1))

ruge_stuben(A, max_levels=10)  # fast if coarsest A is very small
@time ml = ruge_stuben(A, max_levels=2)  # takes ~5 seconds, most of it spent in `inv(Matrix(ml.final_A))`
@btime AMG._solve(ml, b, maxiter=10);  # takes ~16 ms

Use Sparse LU to improve performance

import SuiteSparse.UMFPACK: UmfpackLU
import AlgebraicMultigrid: CoarseSolver

# note: UMFPACK only supports Float64 or ComplexF64
# not sure how to best restrict the types
struct Splu <: CoarseSolver
    LU::UmfpackLU
    Splu(A) = new(lu(A))
end

Base.show(io::IO, p::Splu) = print(io, "Splu")

function (p::Splu)(x, b)
    x[:] = p.LU \ b
end

ml_sp = ruge_stuben(A, max_levels=2, coarse_solver=Splu)  
@btime ruge_stuben(A, max_levels=2, coarse_solver=Splu)  # takes 6 ms, ~1000x faster than before!
@btime AMG._solve(ml_sp, b, maxiter=10);  # takes 9 ms, 2x faster than before
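One way to address the "not sure how to best restrict the types" comment above: parameterize the wrapper on the element types UMFPACK actually supports. This is only a sketch; the `SpluTyped` name is hypothetical, and it is written as a standalone struct here so the snippet is self-contained (in AMG.jl it would subtype `CoarseSolver` like `Splu` does):

```julia
using SparseArrays, LinearAlgebra
import SuiteSparse.UMFPACK: UmfpackLU  # on newer Julia this may live under SparseArrays.UMFPACK

# Hypothetical type-restricted variant of Splu: only admits the element types
# that UMFPACK supports (Float64 / ComplexF64).
struct SpluTyped{T<:Union{Float64,ComplexF64}}
    LU::UmfpackLU{T}
    SpluTyped(A::SparseMatrixCSC{T}) where {T<:Union{Float64,ComplexF64}} = new{T}(lu(A))
end

(p::SpluTyped)(x, b) = (x .= p.LU \ b)

# quick check on a small nonsingular sparse matrix
A = sprand(100, 100, 0.05) + 10I  # diagonally dominant, hence nonsingular
b = A * ones(100)
x = zeros(100)
SpluTyped(A)(x, b)
```

With this definition, constructing `SpluTyped` from, say, a `Float32` sparse matrix fails at dispatch time instead of deep inside UMFPACK.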
ChrisRackauckas commented 2 years ago

Instead, it should just be setup to use LinearSolve.jl so that downstream users can choose which linear solver to use in this step.
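For reference, a coarse solve routed through LinearSolve.jl might look roughly like this. `LinearProblem`, `solve`, and `KLUFactorization` are real LinearSolve.jl API; the adapter struct and its wiring into AMG.jl's coarse-solver call convention are hypothetical:

```julia
using LinearSolve, SparseArrays, LinearAlgebra

# Hypothetical adapter: holds the coarsest-level matrix and a LinearSolve.jl
# algorithm, and solves A x = b in-place when called like AMG.jl's coarse solvers.
struct LinearSolveCoarse{Alg}
    A::SparseMatrixCSC{Float64,Int}
    alg::Alg
end

function (p::LinearSolveCoarse)(x, b)
    prob = LinearProblem(p.A, b)
    x .= solve(prob, p.alg).u  # any LinearSolve.jl algorithm can be swapped in here
end

# quick check with KLU as the coarse factorization
A = sprand(200, 200, 0.05) + 10I
b = A * ones(200)
x = zeros(200)
LinearSolveCoarse(A, KLUFactorization())(x, b)
```

The point of going through LinearSolve.jl is exactly the `alg` slot: downstream users could pick UMFPACK, KLU, or an iterative method without AMG.jl hardcoding any of them.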

learning-chip commented 2 years ago

it should just be setup to use LinearSolve.jl

Interesting. Then I would argue that the presmoother and postsmoother should also accept flexible, generic approximate/iterative solvers. For complicated (say, indefinite or nonsymmetric) problems it makes sense to use GMRES as the smoother (an option in PyAMG), or ILU as the smoother. Many of these are defined in IterativeSolvers.jl or other existing solver packages, but currently AMG.jl only allows GaussSeidel and Jacobi.

What should the new interface look like, then? And what would be the first step to implement it?
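As a sketch of what a pluggable smoother could look like: AMG.jl's built-in smoothers are, to my reading, callables of the form `smoother(A, x, b)`, so a GMRES-based one could wrap IterativeSolvers.jl's `gmres!`. The `GMRESSmoother` struct below is hypothetical; `gmres!` and its keyword arguments are real IterativeSolvers.jl API:

```julia
using IterativeSolvers, SparseArrays, LinearAlgebra

# Hypothetical GMRES smoother: run a few inner GMRES iterations in place of
# Gauss-Seidel/Jacobi sweeps, in the spirit of PyAMG's GMRES smoother option.
struct GMRESSmoother
    iter::Int  # number of inner GMRES iterations per smoothing call
end

(s::GMRESSmoother)(A, x, b) =
    gmres!(x, A, b; maxiter=s.iter, restart=min(s.iter, size(A, 1)))

# quick check on a diagonally dominant sparse matrix
A = sprand(300, 300, 0.05) + 10I
b = A * ones(300)
x = zeros(300)
GMRESSmoother(50)(A, x, b)
```

Something similar would work for ILU via a preconditioner package, which is why accepting any `smoother(A, x, b)` callable (rather than an enum of GaussSeidel/Jacobi) seems like the right interface.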

ChrisRackauckas commented 2 years ago

What should the new interface look like, then?

ruge_stuben(A, max_levels=2, linsolve = KLUFactorization(), presmoother = Krylov_GMRES()), i.e. just passing in solver types for each step from http://linearsolve.sciml.ai/dev/solvers/solvers/.

ViralBShah commented 1 year ago

Happy to give folks commit access to the repo, if they have bandwidth to make these improvements.