JuliaDiff / SparseDiffTools.jl

Fast Jacobian computation through sparsity exploitation and matrix coloring
MIT License

Extend JacVec to have a speedy solution also for matrix multiplication? #243

Open · david-hofmann opened this issue 1 year ago

david-hofmann commented 1 year ago

Is there a chance to expand the code base to support matrix multiplication with the Jacobian? Here is an MWE, based on the example from the documentation, that shows what I am trying to accomplish:

using SparseDiffTools
using ForwardDiff
using BenchmarkTools

fcalls = 0
function g(x)
    global fcalls += 1
    y = zero(x)
    for i in 2:length(x)-1
      y[i] = x[i-1] - 2x[i] + x[i+1]
    end
    y[1] = -2x[1] + x[2]
    y[end] = x[end-1] - 2x[end]
    y
end

x = rand(30)
J = JacVec(g, x)
V = rand(30, 50)

@benchmark for i = 1:size(V, 2)
    J*V[:, i]
end

@benchmark ForwardDiff.jacobian(g, x)*V

The current solution is to loop over all column vectors, which (when the number of dimensions is small enough, as in this example) makes the Jacobian-free approach slower than simply computing the Jacobian with, for instance, ForwardDiff.jl and then multiplying. If anyone has a faster solution based on the existing code base, I'd be very grateful.
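One workaround that avoids both the column loop and any SparseDiffTools internals (a sketch, not part of the library's API; `batched_jacvec` is a hypothetical helper name): by the chain rule, the Jacobian of `t -> g(x + V*t)` at `t = 0` is exactly `J(g, x) * V`, so a single `ForwardDiff.jacobian` call pushes all columns of `V` through `g` at once in chunked dual-number sweeps.

```julia
using ForwardDiff

# Hypothetical helper, not part of SparseDiffTools: computes J(f, x) * V
# without materializing J. The chain rule gives
#   d/dt f(x + V*t) |_{t=0} = J(f, x) * V,
# so one call to ForwardDiff.jacobian on the map t -> f(x + V*t)
# evaluates all size(V, 2) directional derivatives together.
batched_jacvec(f, x, V) =
    ForwardDiff.jacobian(t -> f(x .+ V * t), zeros(eltype(x), size(V, 2)))
```

With the MWE above, `JV = batched_jacvec(g, x, V)` replaces the 50-iteration `J*V[:, i]` loop with a single chunked evaluation.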

gdalle commented 1 week ago

After a certain point it becomes more interesting to compute the sparse Jacobian J once and for all instead of iterating matrix-vector products with a lazy operator. The turning point depends on the number of columns in the matrix V.
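To make the trade-off concrete for the MWE above (this is a sketch following the colored-Jacobian usage shown in the SparseDiffTools README, reusing `g`, `x`, and `V` from the MWE; the exact keyword names are an assumption): `g` is tridiagonal, so its columns can be grouped into 3 colors, and the full sparse Jacobian costs roughly 3 dual evaluations of `g`, while 50 lazy `JacVec` products cost roughly 50.

```julia
using SparseDiffTools, SparseArrays, LinearAlgebra

# Known sparsity pattern of g: a 30x30 tridiagonal matrix.
sparsity = sparse(Tridiagonal(ones(29), ones(30), ones(29)))
colors = matrix_colors(sparsity)  # 3 colors for a tridiagonal pattern

# Compute the full sparse Jacobian in ~maximum(colors) dual sweeps,
# then do one sparse matrix-matrix multiply instead of 50 lazy JVPs.
J = forwarddiff_color_jacobian(g, x, colorvec = colors, sparsity = sparsity)
JV = J * V
```

Once `size(V, 2)` exceeds the number of colors (here 3 vs. 50), forming `J` once wins, which is the crossover described above.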

ChrisRackauckas commented 1 week ago

That's a non-solution, since there are also a lot of situations where you need the operator, like in JFNK.

gdalle commented 1 week ago

For the noobs among us, what is JFNK?

ChrisRackauckas commented 1 week ago

Jacobian-free Newton-Krylov.
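For context: JFNK solves a nonlinear system F(x) = 0 with Newton's method, but replaces the direct linear solve J Δx = -F(x) by a Krylov method such as GMRES, which only ever touches J through products J*v, exactly what `JacVec` provides lazily. A minimal sketch, assuming IterativeSolvers.jl's `gmres` accepts the lazy operator (`jfnk` is a hypothetical helper with a fixed iteration count and no line search or convergence check):

```julia
using SparseDiffTools, IterativeSolvers

# Hypothetical minimal Jacobian-free Newton-Krylov loop: each Newton step
# solves J dx = -F(x) with GMRES, which needs only J*v products,
# supplied lazily by JacVec via forward-mode dual numbers.
function jfnk(F, x; iters = 10)
    for _ in 1:iters
        J = JacVec(F, x)       # lazy Jacobian operator at the current x
        dx = gmres(J, -F(x))   # Krylov solve, no explicit Jacobian formed
        x = x + dx
    end
    x
end
```

For example, `jfnk(x -> x .^ 2 .- 2.0, [1.0])` should converge to `[sqrt(2)]` without ever building a Jacobian matrix.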