SciML / DASSL.jl

Solves stiff differential algebraic equations (DAE) using a variable-stepsize backward differentiation formula (BDF), in the SciML scientific machine learning organization

Improve efficiency #26

Open MartinOtter opened 5 years ago

MartinOtter commented 5 years ago

File src/DASSL.jl, line 686, contains the function:

# generate a function that computes approximate jacobian using forward
# finite differences
function numerical_jacobian(F,reltol,abstol,weights)
    function numjac(t, y, dy, a)
        ep      = eps(one(t))   # this is the machine epsilon
        h       = 1/a           # h ~ 1/a
        wt      = weights(y,reltol,abstol)
        # delta for approximation of jacobian.  I removed the
        # sign(h_next*dy0) from the definition of delta because it was
        # causing trouble when dy0==0 (which happens for ord==1)
        edelta  = diagm(0=>max.(abs.(y),abs.(h*dy),wt)*sqrt(ep))

        b=dy-a*y
        f(y1) = F(t,y1,a*y1+b)

        n   = length(y)
        jac = Array{eltype(y)}(undef,n,n)
        for i=1:n
            jac[:,i]=(f(y+edelta[:,i])-f(y))/edelta[i,i]
        end

        jac
    end
end

This (central) function should be improved:

  1. In the for loop, f(y) is called in every iteration, although y does not change. This means there are n-1 unnecessary calls of the right-hand-side function. The call of f(y) should be moved out of the for loop and its result stored in a vector.

  2. The statement jac = Array{eltype(y)}(undef,n,n) allocates a new Jacobian matrix every time this function is called. So, when this function is called 100 times, memory for 100 Jacobians is allocated. This should be improved: at the start of the simulation, memory for one Jacobian should be allocated, which is then updated in place within this function. (Both fixes are sketched below.)
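A minimal sketch of both suggested fixes (the name numjac! and the preallocated jac argument are illustrative, not part of the DASSL.jl API): the right-hand side is evaluated once outside the loop, and the caller-allocated Jacobian is filled in place.

function numjac!(jac, f, y, edelta)
    f0 = f(y)                 # evaluate the right-hand side once, outside the loop
    for i = 1:length(y)
        # forward-difference column i, reusing f0 instead of recomputing f(y);
        # jac was allocated once by the caller and is overwritten here
        jac[:, i] = (f(y + edelta[:, i]) - f0) / edelta[i, i]
    end
    return jac
end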

ChrisRackauckas commented 5 years ago

There's tons of other things that can be done here as well. Lots of allocations to get rid of, overloads for user-Jacobians, etc. But I don't plan on working on it at all because this is somewhat of a dead end development-wise. Essentially, we'd have to add every feature of OrdinaryDiffEq.jl here, or just add one function to OrdinaryDiffEq.jl and it'll get all of the features, so we're doing the latter. So this code has been kept pretty much intact from its original creation (way before I started using Julia, this is "inherited code"), and sooner rather than later (a GSoC is slated for it) we'll have OrdinaryDiffEq.jl handle it, and this code will be deprecated. If you want to work on it a bit that's fine, but I don't plan to use my own time improving this codebase myself.

MartinOtter commented 5 years ago

[...] sooner rather than later (a GSoC is slated for it) we'll have OrdinaryDiffEq.jl handle it

Hm. Just to make sure that I understand correctly: DASSL.jl will be replaced by a new implementation within OrdinaryDiffEq.jl? So it is not worth putting effort into improving DASSL.jl?

ChrisRackauckas commented 5 years ago

Indeed. OrdinaryDiffEq already handles mass-matrix ODEs in a few of its methods, and it just needs a small tweak to allow DAEProblems. Once that's done, someone just needs to make a perform_step! overload for how DAE BDF works, and then it will get all of the events, GPU, Newton-Krylov, adaptivity, etc. support for free, which is why we do it that way. And we need the consistent initial condition finder from DASSL.
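For context, this is the standard mass-matrix DAE formulation OrdinaryDiffEq already supports (a sketch based on the classic Robertson problem from the DifferentialEquations.jl docs; Rodas5 is just one of the mass-matrix-capable methods):

using OrdinaryDiffEq

# Robertson chemical kinetics as a mass-matrix DAE: the third
# equation is the algebraic constraint y1 + y2 + y3 = 1.
function rober!(du, u, p, t)
    y1, y2, y3 = u
    du[1] = -0.04y1 + 1.0e4 * y2 * y3
    du[2] =  0.04y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2^2
    du[3] =  y1 + y2 + y3 - 1.0
end

M = [1.0 0.0 0.0
     0.0 1.0 0.0
     0.0 0.0 0.0]                    # singular mass matrix => DAE

f = ODEFunction(rober!, mass_matrix = M)
prob = ODEProblem(f, [1.0, 0.0, 0.0], (0.0, 1.0e5))
sol = solve(prob, Rodas5())          # a stiff method with mass-matrix support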

So I think that a better use of time would be to just find out why QNDF is not performing as well as we'd like and maybe implement a divided differences form as well. But for example, the whole code for QNDF is https://github.com/JuliaDiffEq/OrdinaryDiffEq.jl/blob/master/src/perform_step/bdf_perform_step.jl#L739-L878

which is much easier to maintain and add features to than having a library per method (we are at 300 methods, so...)
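For reference, QNDF is invoked like any other OrdinaryDiffEq solver behind the common solve interface (a minimal sketch on a simple stiff ODE, not taken from the thread):

using OrdinaryDiffEq

# Stiff linear test ODE: u' = -1000u + 3000 - 2000*exp(-t)
g(u, p, t) = -1000.0 * u + 3000.0 - 2000.0 * exp(-t)
prob_stiff = ODEProblem(g, 0.0, (0.0, 1.0))
sol = solve(prob_stiff, QNDF())      # the quasi-constant-stepsize NDF method discussed above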

I'll open a new issue in OrdinaryDiffEq.jl to discuss the BDF differences. @YingboMa and I have been talking about it a lot internally but we should start writing things down.