Except it's not broken and it is tested? https://github.com/JuliaDiffEq/DiffEqSensitivity.jl/blob/master/test/adjoint.jl#L73-L117
What code did you run?
```julia
julia> using DifferentialEquations, DiffEqSensitivity

julia> function lotka_volterra(du,u,p,t)
         x, y = u
         α, β, δ, γ = p
         du[1] = dx = (α - β*y)x*t
         du[2] = dy = (δ*x - γ)y
       end

julia> p = [2.2, 1.0, 2.0, 0.4]

julia> prob = ODEProblem(lotka_volterra,[1.0,1.0],(0.0,10.0),p)

julia> sol = solve(prob,Vern9())

julia> function dg(out,u,p,t,i)
         out .= 1 .- u
       end

julia> adjoint_sensitivities(sol,Vern9(),dg)
ERROR: type ODEAdjointSensitivityFunction has no field g_gradient_config
```
My example might be bad, in which case it's just a case of throwing a different error. But this line refers to a field that doesn't exist, so something has gone wrong in that code.
You didn't supply a `g`, and `dg` is in the form for a discrete sensitivity. This is just a case of RTFM.

If `t` is not provided: `adjoint_sensitivities(sol,Vern9(),dg)`
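(For concreteness, a minimal sketch of the discrete form: since the `dg(out,u,p,t,i)` above is the discrete shape, it also needs the time points passed explicitly. The positional signature and the `ts` values here are assumed from the docs of this era, not quoted from the thread:)

```julia
# Sketch only: discrete adjoint, where dg(out,u,p,t,i) is evaluated
# at each supplied time point ts[i]. Signature assumed:
# adjoint_sensitivities(sol, alg, dg, ts).
ts = collect(0.0:0.5:10.0)   # illustrative time points
res = adjoint_sensitivities(sol, Vern9(), dg, ts)
```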
For `adjoint_sensitivities`, the syntax for continuous is `adjoint_sensitivities(sol,Vern9(),g,nothing,dg)`, where `g(u,p,t)` is the functional and `dg(out,u,p,t)` is its gradient w.r.t. `u`.
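(A minimal sketch of that continuous form, using a hypothetical quadratic functional so that `dg` really is the `u`-gradient of `g`; the names `dg_continuous` and the specific cost are illustrative:)

```julia
# Sketch only: continuous adjoint with the syntax described above.
# g(u,p,t) is the running cost; dg fills ∂g/∂u in-place.
g(u,p,t) = sum((1.0 .- u).^2) / 2
function dg_continuous(out,u,p,t)
  out .= u .- 1.0   # ∂/∂u of sum((1 .- u).^2)/2
end
res = adjoint_sensitivities(sol, Vern9(), g, nothing, dg_continuous)
```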
I didn't open this issue because I want a continuous sensitivity; I opened it because this example (even if not correct) triggers obviously broken code in the package, rather than throwing a sensible error. Why is that branch even there if it only gets triggered by incorrect user input? It should clearly either be removed entirely or fixed and tested.
Okay, it looks like https://github.com/JuliaDiffEq/DiffEqSensitivity.jl/commit/3e101221bee9889caf64c5685d9303f1108412a2 broke continuous adjoint sensitivities in the case where the user doesn't provide a gradient function. The gradient config needs to be re-added. The deletes went too far and removed more than just the ReverseDiff usage.
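(For reference, a minimal sketch of what that missing fallback would do: when only `g` is supplied, build a gradient config once and reuse it to fill `∂g/∂u` in-place. This uses plain ForwardDiff calls; the closure and field names are illustrative, not the package's actual internals:)

```julia
using ForwardDiff

g(u, p, t) = sum((1.0 .- u).^2) / 2   # hypothetical user functional

u0 = [1.0, 1.0]
gfix = u -> g(u, nothing, 0.0)        # p and t fixed for the sketch

# The config caches work buffers for repeated gradient calls;
# this is the kind of object the commit deleted.
g_grad_config = ForwardDiff.GradientConfig(gfix, u0)

# Wherever the adjoint pass needs the gradient of g w.r.t. u:
out = similar(u0)
ForwardDiff.gradient!(out, gfix, u0, g_grad_config)
```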
That commit deleted `g_grad_config`, which was unused, not `g_gradient_config`, which never existed in the first place. So it was broken before, though of course the correct fix would have been to rename one to the other.
Should be easy to use Flux here at the point where we need the gradient. If you can provide a test case I'm also happy to look at it myself.
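(A sketch of what that could look like with Tracker, Flux's AD of this era; the functional `g` is a stand-in, and only the generic `Tracker.gradient`/`Tracker.data` calls are assumed:)

```julia
using Flux.Tracker   # Flux's AD at the time (later split into Tracker.jl)

g(u, p, t) = sum((1.0 .- u).^2) / 2   # stand-in functional

u = [1.0, 1.0]
# Tracker.gradient returns one (tracked) gradient per argument.
du, = Tracker.gradient(u -> g(u, nothing, 0.0), u)
du = Tracker.data(du)                 # strip tracking → plain Array
```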
> If `t` is not provided:

It would be useful for me to understand how the continuous case is different from the discrete one; at present it doesn't appear to be tested and has bitrotted.
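(For reference, the standard distinction, not spelled out in this thread: the discrete adjoint differentiates a cost evaluated only at chosen time points, while the continuous adjoint differentiates an integral over the whole solution:)

```latex
% Discrete: cost summed at the supplied time points t_i
G(p) = \sum_i g\big(u(t_i), p, t_i\big)
% Continuous: cost integrated over the solve interval
G(p) = \int_{t_0}^{T} g\big(u(t), p, t\big)\, dt
```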