Looks like there is an issue with how the Enzyme activity is set that I need to handle here, but @wsmoses this is also a case of an uncaught LLVM assertion.
Is the issue that you're running an older DiffEqSensitivity and not SciMLSensitivity? There was a name change when we did the breaking update to v7 in June. When I updated the script, it seems to run fine:
import DifferentialEquations, SciMLSensitivity
import Optimization, OptimizationOptimisers
import Random
using DifferentialEquations: solve
rng = Random.default_rng()
Random.seed!(rng, 12345)
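# Lotka-Volterra predator-prey dynamics, written as an in-place ODE right-hand side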
function lotka_volterra(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] = α*x - β*x*y
    du[2] = -δ*y + γ*x*y
end
u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
p = [1.5, 1.0, 3.0, 1.0]
ode_prob = DifferentialEquations.ODEProblem(lotka_volterra, u0, tspan, p)
ode_sol = solve(ode_prob, saveat=0.1)
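# Solve the ODE with the candidate parameters on the same time grid as the reference solution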
function predict(parameters, ode_prob=ode_prob, t=ode_sol.t)
    solve(ode_prob, saveat = t, p = parameters)
end

function loss_function(parameters, data)
    pred = Array(predict(parameters))[1,:]
    return sum(abs2, pred .- data)
end
ps_initial = ode_prob.p
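# scalar target value; it broadcasts against the prediction vector in the loss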
data = 1.0
loss_function(ps_initial, data)
losses = Float64[]
function callback(p, l)
    push!(losses, l)
    if length(losses) % 50 == 0
        #Plots.plot(losses, show = :inline, yscale = :log10,
        #    label = "loss", xlabel = "#epochs", ylabel="loss (log10 scale)")
    end
    return false # returning `true` would halt the optimization
end
ps_trained = let data = data
    minimizer = ps_initial
    # AutoZygote() differentiates the loss with Zygote; SciMLSensitivity supplies the ODE adjoints
    opt_function = Optimization.OptimizationFunction(
        (ps, data) -> loss_function(ps, data),
        Optimization.AutoZygote(),
    )
    # two-stage schedule: a coarse pass with a large step size, then a refinement pass
    for (optimizer, maxiters) in [
        (OptimizationOptimisers.Adam(0.1), 300),
        (OptimizationOptimisers.Adam(0.01), 500),
    ]
        opt_prob = Optimization.OptimizationProblem(opt_function, minimizer, data)
        opt_sol = solve(opt_prob, optimizer,
            callback = callback, maxiters = maxiters)
        minimizer = opt_sol.minimizer
    end
    minimizer
end
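If an environment still has the old package, the rename is just a swap in the Pkg REPL (a sketch, assuming no other packages pin the old name):

pkg> rm DiffEqSensitivity
pkg> add SciMLSensitivity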
Are there any tutorials left out there that mention DiffEqSensitivity? I think all of those should've been updated already, but let me know if I missed one.
Thank you very much for your help - yes, switching to SciMLSensitivity worked for me. You can close the ticket if you don't need it to track other problems related to the original failure.
Thanks, and thanks for pointing out the old UDE tutorials. We just changed the SciMLSensitivity docs to build via Buildkite so that we could use more dedicated compute resources (these docs are big 😅), so hopefully we will soon merge the UDE paper repo pieces into tested tutorials here.
Hi there, I am starting with SciML and am getting a core dump when switching from Julia 1.7.2 to Julia 1.8.1.
It is a simple combination of ODEProblem and Optimization; it worked before but now fails dramatically.
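In outline it is the same Lotka-Volterra fit as in the script above, only against the pre-rename sensitivity package (an abbreviated sketch, not the full script):

import DifferentialEquations, DiffEqSensitivity  # pre-rename package, later renamed to SciMLSensitivity
import Optimization, OptimizationOptimisers
# ... same ODEProblem, OptimizationFunction(..., Optimization.AutoZygote()), and solve pipeline as above ...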
This will throw:

julia: /workspace/srcdir/Enzyme/enzyme/Enzyme/GradientUtils.h:2093: llvm::SmallVector<llvm::SelectInst*, 4> DiffeGradientUtils::addToDiffe(llvm::Value*, llvm::Value*, llvm::IRBuilder<>&, llvm::Type*, llvm::ArrayRef<llvm::Value*>, llvm::Value*): Assertion `!isConstantValue(val)' failed.

Here are all the details:
Here is my Manifest.toml (attached as .txt because GitHub won't allow .toml files): Manifest.toml.txt