Closed cuihantao closed 4 years ago
Is the decision of whether to update the Jacobian up to Sundials.jl or to Sundials IDA itself? The Scholarpedia page at http://www.scholarpedia.org/article/Sundials_equation_solvers#IDA.2C_for_DAE_Systems says: "The nonlinear iteration is Modified or Inexact Newton, and Jacobian information is updated only when necessary for convergence, not at every step."
It's up to Sundials IDA. However, you can probably trigger it by doing a `reinit!`. That should happen naturally if you add a discrete callback at the time and then just do `u_modified!(integrator, true)`. The `reinit!` should then require a new Jacobian to be drawn. A reinit is required anyway, since the history values wouldn't be correct and it would need to drop back to first order.
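A minimal sketch of that pattern, assuming a `DAEProblem` named `prob` and a vector `switch_times` of discontinuity times are defined elsewhere:

```julia
using DifferentialEquations, Sundials

# Fire exactly at the switch times supplied via tstops.
condition(u, t, integ) = t in switch_times
# Flag the state as modified so the integrator reinitializes:
# IDA then restarts at first order and draws a fresh Jacobian.
affect!(integ) = u_modified!(integ, true)
cb = DiscreteCallback(condition, affect!)

sol = solve(prob, IDA(), callback = cb, tstops = switch_times)
```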
Chris - thanks for the rapid response. I will check `reinit!` and see if that can solve the problem. Closing the issue for now; I will post updates.
Note that you shouldn't have to call `reinit!` if you do:

```julia
condition(u, t, integ) = t in switch_times
affect!(integ) = nothing
cb = DiscreteCallback(condition, affect!)
sol = solve(prob,
    IDA(linear_solver = :KLU, max_nonlinear_iters = 15,
        max_convergence_failures = 10, max_order = 2),
    dense = false, tstops = switch_times,
    callback = cb,
)
```

That should be sufficient: having a callback means `u` could change, so it does a reinit to ensure consistency after user modifications.
Chris - it works like a charm! Thanks so much for pointing me to the solution!
Hi Chris, @ChrisRackauckas
I have a follow-up question about updating the Jacobian, but not with IDA. I was able to reformulate my problem as a mass-matrix ODE so that more solvers are available. Still, there are predefined times `switch_times` at which the algebraic equations change and variables are reinitialized.
The issue is that the Jacobian function is not called at the switching times. This leads to convergence problems even after the variables are correctly reinitialized, since some Jacobian entries change from zero to nonzero after the switch.
The following code shows that the Jacobian is only updated at the beginning, although there is no convergence issue in this small example:
```julia
function f(du, u, p, t)
    du[1] = 2.0 * u[1] - 1.2 * u[1] * u[2]
    du[2] = -3 * u[2] + u[1] * u[2]
    println("* At t=$t, du updated")
end

function f_jac(J, u, p, t)
    J[1, 1] = 2.0 - 1.2 * u[2]
    J[1, 2] = -1.2 * u[1]
    J[2, 1] = 1 * u[2]
    J[2, 2] = -3 + u[1]
    println("* At t=$t, J updated")
    nothing
end

mass = [1 0; 0 1]
ff = ODEFunction(f; jac = f_jac, mass_matrix = mass)

switch_times = [1.0, 2.0]
condition(u, t, integ) = t in switch_times
affect!(integ) = nothing
cb = DiscreteCallback(condition, affect!)

prob = ODEProblem(ff, ones(2), (0.0, 3.0))
sol = solve(prob, Trapezoid(), tstops = switch_times, callback = cb)
```
The truncated output is:

```
* At t=0.0, du updated
* At t=0.0, du updated
* At t=0.006565321642986127, du updated
* At t=0.0, J updated <-----------Jacobian updated here
* At t=0.001578760673188492, du updated
* At t=0.001578760673188492, du updated
* At t=0.001578760673188492, du updated
...
* At t=0.9181855190185823, du updated
* At t=0.9181855190185823, du updated
* At t=0.9181855190185823, du updated
* At t=1.0, du updated
* At t=1.0, du updated
* At t=1.0, du updated
* At t=1.0, du updated
* At t=1.0, du updated
* At t=1.0, du updated
* At t=1.0, du updated
* At t=1.0949235285493744, du updated
* At t=1.0949235285493744, du updated
...
* At t=1.9759066566592538, du updated
* At t=1.9759066566592538, du updated
* At t=1.9759066566592538, du updated
* At t=2.0, du updated
* At t=2.0, du updated
* At t=2.0, du updated
* At t=2.0, du updated
* At t=2.0, du updated
* At t=2.0746722193373897, du updated
* At t=2.0746722193373897, du updated
* At t=2.0746722193373897, du updated
* At t=2.0746722193373897, du updated
* At t=2.0746722193373897, du updated
...
```
I was not able to update the Jacobian manually in the `affect!` function because `J` is not a field of `integrator`.

Could we add an option to force a Jacobian update at the first iteration of each time in `tstops`? Further, could we add an option to force a Jacobian update at every iteration at each time in `tstops`? The latter could improve convergence, at the cost of extra Jacobian evaluations.

The above is just based on my understanding as a user. Your suggestions are very welcome.
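One possible stopgap, not suggested in this thread: split the integration at the switch times and restart, so each segment constructs a fresh integrator and therefore re-evaluates its Jacobian. A rough sketch, assuming `switch_times` is sorted and lies strictly inside `tspan`:

```julia
using OrdinaryDiffEq

# Integrate piecewise; every `solve` builds a new integrator, so the
# Jacobian is recomputed at the start of each segment.
function solve_piecewise(ff, u0, tspan, switch_times; alg = Trapezoid())
    edges = [tspan[1]; switch_times; tspan[2]]
    u, sols = u0, []
    for i in 1:length(edges)-1
        seg = ODEProblem(ff, u, (edges[i], edges[i+1]))
        s = solve(seg, alg)
        u = s.u[end]   # hand the final state to the next segment
        push!(sols, s)
    end
    return sols
end
```

Any state reinitialization belonging to a switch would have to be applied to `u` between segments, since this bypasses the callback machinery.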
@cuihantao I see you are interested in power systems modeling using Sundials. We have a package https://github.com/Energy-MAC/LITS.jl which is being developed at this time to exploit DifferentialEquations.jl for Large Scale Power Systems modeling.
Thanks for the information! I have tried LITS.jl and read your paper. Sent you a separate email.
@cuihantao that's a known issue with reinitialization of mass-matrix ODEs that I hope we fix this summer.
@ChrisRackauckas Thanks, Chris. Mind pointing me to the files? I can tinker with it in the evenings.
It would be in https://github.com/SciML/OrdinaryDiffEq.jl/blob/master/src/integrators/integrator_utils.jl . I think @kanav99 might've had a version that was doing it? Basically, right now DAEs in OrdinaryDiffEq don't reinit after callbacks, but they should.
TL;DR: IDA won't converge after an algebraic equation switch, because the Jacobian is only updated once per step.
Hi,
First of all, thanks for developing and supporting the DifferentialEquations.jl ecosystem. I find this work impressive and have started trying to use it for power system simulation problems, which are formulated as index-1 DAEs with a large set of nonlinear algebraic equations. At predefined times, the algebraic equations are modified, for example by removing part of an equation to simulate a device going out of service.
My simulation setup is currently a hybrid of Python and Julia. The Python part implements the models, i.e., the functions that compute the residuals of `G(u, du, t) = res` and the Jacobians given by `gamma * dG/d(du) + dG/du`. The changes to the algebraic equations at predefined times are also implemented in Python because, for now, they are part of the model. The Julia part constructs the `DAEFunction` and `DAEProblem` and uses IDA for solving. The Julia part looks like
The switching event happens at the end of `t=2.0`. I set `switch_times = [1.9999, 2.0, 2.0001]` to force the solver to stop right before, at, and after the switching. I have tried switching out different devices and simulating the DAE problem. Results of the solved cases match those from my Python solver implementing the trapezoidal rule, but there are cases where IDA won't converge after the switching.

I tried to print out the maximum mismatch (residual) from Julia and compare it with my Python code. The debugging printouts and my interpretations are as follows:

- IDA takes a step of `h=0.000025` and updates the Jacobian once. Since the algebraic switching is drastic, the approximated Jacobian fails to converge at `t=2.000025`. The residuals are `6.784984056352011` (iter. 1), `0.3345103040021776` (iter. 2), and `0.17705125103928232` (iter. 3).
- IDA then retries with `h=0.00000625` and updates the Jacobian once. No convergence either. The residuals are `6.784984056352011` (iter. 1), `0.33448411910463083` (iter. 2), and `0.17703102372314716` (iter. 3), almost the same as with `h=0.000025` (with the tiny differences caused by the differential variables).
- I compared the residuals with my Python program. Residuals from iter. 1 and iter. 2 match my trapezoidal-rule solver identically. Starting from iter. 3, IDA's residuals are much larger than those from the trapezoidal solver. Since the Jacobian is only accurate for iter. 1 and hardly usable for the subsequent iterations, this explains why IDA won't converge.
Questions:

1. Is the decision of whether to update the Jacobian up to Sundials.jl or to Sundials IDA itself? The Scholarpedia page at http://www.scholarpedia.org/article/Sundials_equation_solvers#IDA.2C_for_DAE_Systems says: "The nonlinear iteration is Modified or Inexact Newton, and Jacobian information is updated only when necessary for convergence, not at every step."
2. Is there any workaround? It seems that I cannot force a call to `jac!` at each iteration, unless I use some messy global variables to store the `J` matrix and forcefully write to it in the residual function.
3. Ideally, additional logic for updating the Jacobian could be implemented: either one Jacobian update per step, or one update per iteration. In the proximity of `tstops`, or after a non-convergence, update the Jacobian once per iteration.

Thanks in advance for your time.