Closed — Vaibhavdixit02 closed this pull request 1 year ago
Merging #565 (c2218b5) into master (edad8dd) will decrease coverage by 0.57%. The diff coverage is 0.00%.
@@            Coverage Diff            @@
##           master     #565      +/-   ##
=========================================
- Coverage   10.27%    9.71%    -0.57%
=========================================
  Files          41       41
  Lines        2374     2367        -7
=========================================
- Hits          244      230       -14
- Misses       2130     2137        +7

Impacted Files | Coverage Δ | |
---|---|---|
ext/OptimizationEnzymeExt.jl | 0.00% <0.00%> (ø) | |
... and 3 files with indirect coverage changes
Failures are only reproducible when running `]test`, but the code works when run directly.
Try forcing Enzyme v0.11.0 for now. SciMLSensitivity was showing issues on later versions that haven't been isolated yet.
Alternatively, @Vaibhavdixit02, if you can isolate the failure and post it as an issue, we can fix it!
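A minimal sketch of the suggested downgrade, assuming it is applied in the currently active environment (version number taken from the comment above):

```julia
# Sketch: force Enzyme v0.11.0 in the active environment, then pin it so that
# a later `Pkg.update()` won't move it (uses only the stdlib Pkg API).
using Pkg
Pkg.add(name = "Enzyme", version = "0.11.0")
Pkg.pin("Enzyme")
```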
@wsmoses I guess only the latest (0.11.5) release's error is relevant? Btw, this error shows up on CI if you can take a quick look, in case that's enough to diagnose, because as I mentioned it appears only when running `]test` but not when running the code directly...
Like even just doing `julia --project testfile.jl` (assuming you add the relevant test packages)?
I might start at that and then remove the ones that don't cause the failure.
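One way to get closer to what `]test` actually does than a plain `--project` run is to activate the package's test environment first. This sketch assumes the third-party TestEnv.jl package is installed and that the file of interest is `test/ADtests.jl` (both are assumptions, not part of the thread):

```julia
# Sketch: make the test dependencies from test/Project.toml available in the
# current session, then run a single test file (assumes TestEnv.jl is installed).
using TestEnv
TestEnv.activate("Optimization")
include("test/ADtests.jl")
```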
For now ignore the runtimeActivity one, but isolating the zero one would be especially useful -- in particular for @ChrisRackauckas
@Vaibhavdixit02 I was looking into this and if I have an env with
(foo) pkg> st
Status `~/.julia/dev/Optimization/foo/Project.toml`
[7da242da] Enzyme v0.11.5
[7f7a1694] Optimization v3.15.2 `..#enzymebump`
And try to run
using Optimization
using Enzyme
rosenbrock(x, p = nothing) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
function con2_c(res, x, p)
    res .= [x[1]^2 + x[2]^2, x[2] * sin(x[1]) - x[1]]
end
x0 = zeros(2)
optf = OptimizationFunction(rosenbrock, Optimization.AutoEnzyme(), cons = con2_c)
optprob = Optimization.instantiate_function(optf, x0, Optimization.AutoEnzyme(),
nothing, 2)
I get
ERROR: ArgumentError: The passed automatic differentiation backend choice is not available. Please load the corresponding AD package Enzyme.
Stacktrace:
[1] instantiate_function(f::Function, x::Vector{Float64}, adtype::AutoEnzyme, p::Nothing, num_cons::Int64)
@ Optimization ~/.julia/dev/Optimization/src/function.jl:114
[2] top-level scope
@ ~/.julia/dev/Optimization/test/scratch.jl:16
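Since the error claims the Enzyme extension is missing even though Enzyme is loaded, one quick diagnostic (a hypothetical check, not from the thread) on Julia ≥ 1.9 is to ask whether the extension module actually exists in the session:

```julia
# Diagnostic sketch: `nothing` here means OptimizationEnzymeExt never loaded,
# which would produce exactly the ArgumentError above (requires Julia >= 1.9).
using Optimization, Enzyme
ext = Base.get_extension(Optimization, :OptimizationEnzymeExt)
@show ext
```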
Could it be that the tests in general silently fail to execute `Optimization.instantiate_function`, and therefore the constraint Jacobian stays at its zero initialization?
That doesn't look likely
julia> using Optimization
julia> using Enzyme
julia> rosenbrock(x, p = nothing) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
rosenbrock (generic function with 2 methods)
julia> function con2_c(res, x, p)
res .= [x[1]^2 + x[2]^2, x[2] * sin(x[1]) - x[1]]
end
con2_c (generic function with 1 method)
julia> x0 = zeros(2)
2-element Vector{Float64}:
0.0
0.0
julia> optf = OptimizationFunction(rosenbrock, Optimization.AutoEnzyme(), cons = con2_c)
(::OptimizationFunction{true, AutoEnzyme, typeof(rosenbrock), Nothing, Nothing, Nothing, typeof(con2_c), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}) (generic function with 1 method)
julia> optprob = Optimization.instantiate_function(optf, x0, Optimization.AutoEnzyme(),
nothing, 2)
(::OptimizationFunction{true, AutoEnzyme, typeof(rosenbrock), Optimization.OptimizationEnzymeExt.var"#grad#18"{OptimizationFunction{true, AutoEnzyme, typeof(rosenbrock), Nothing, Nothing, Nothing, typeof(con2_c), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Optimization.OptimizationEnzymeExt.var"#1#17"{Nothing}}, Optimization.OptimizationEnzymeExt.var"#hess#21"{OptimizationFunction{true, AutoEnzyme, typeof(rosenbrock), Nothing, Nothing, Nothing, typeof(con2_c), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Optimization.OptimizationEnzymeExt.var"#g#20", Optimization.OptimizationEnzymeExt.var"#1#17"{Nothing}}, Optimization.OptimizationEnzymeExt.var"#6#26"{OptimizationFunction{true, AutoEnzyme, typeof(rosenbrock), Nothing, Nothing, Nothing, typeof(con2_c), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Optimization.OptimizationEnzymeExt.var"#1#17"{Nothing}}, Optimization.OptimizationEnzymeExt.var"#7#27"{OptimizationFunction{true, AutoEnzyme, typeof(rosenbrock), Nothing, Nothing, Nothing, typeof(con2_c), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}, Optimization.OptimizationEnzymeExt.var"#9#29", Optimization.OptimizationEnzymeExt.var"#13#33"{Int64, Vector{Optimization.OptimizationEnzymeExt.var"#12#32"{Int64, OptimizationFunction{true, AutoEnzyme, typeof(rosenbrock), Nothing, Nothing, Nothing, typeof(con2_c), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, 
typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing, Int64}}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}) (generic function with 1 method)
(temp) pkg> st
Status `~/Optimization.jl/temp/Project.toml`
[7da242da] Enzyme v0.11.5
[7f7a1694] Optimization v3.15.2 `https://github.com/SciML/Optimization.jl.git#enzymebump`
That doesn't look likely
Yeah, I don't know what was wrong there; now I can't reproduce it anymore. But I can reproduce the issue via `]test`. So it seems very much like something in ./test/Project.toml is introducing something funky.
And weirdly, it doesn't even happen when I isolate the Hessian implementation in a new package and run `]test` there, so this is very hard for me to understand so far.
very hard to understand so far for me
For me too! Unfortunately I don't understand well enough how the ./Project.toml and ./test/Project.toml actually interact for `Pkg.test`. Also, I'm not deep enough into the complex interactions of all the SciML packages / extension patterns used to be of much help here. Maybe @ChrisRackauckas has a hunch about what could be going on here?
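For what it's worth, the interaction is roughly this: `Pkg.test` builds a temporary environment from the package's own `[deps]` plus everything in test/Project.toml, so test-only packages are loadable under `]test` but invisible to a plain `--project` run. A hypothetical minimal test/Project.toml (the Test UUID is the stdlib one; everything else here is illustrative):

```toml
# test/Project.toml -- extra packages available only under Pkg.test. A stray
# `using` of one of these (e.g. Tracker) can trigger package extensions that a
# direct `--project` run never loads, which is one way ]test and direct runs
# can diverge.
[deps]
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
```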
Try moving the test out of the safetestset?
Not quite sure why, but apparently an unused `using Tracker` causes the error (removing the `using Tracker` allows success).
# ADTests.jl
using Optimization, Test
using Tracker
using Enzyme, Random
x0 = zeros(2)
rosenbrock(x, p = nothing) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
l1 = rosenbrock(x0)
function g!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    G[2] = 200.0 * (x[2] - x[1]^2)
end
function h!(H, x)
    H[1, 1] = 2.0 - 400.0 * x[2] + 1200.0 * x[1]^2
    H[1, 2] = -400.0 * x[1]
    H[2, 1] = -400.0 * x[1]
    H[2, 2] = 200.0
end
G1 = Array{Float64}(undef, 2)
G2 = Array{Float64}(undef, 2)
H1 = Array{Float64}(undef, 2, 2)
H2 = Array{Float64}(undef, 2, 2)
g!(G1, x0)
h!(H1, x0)
cons = (res, x, p) -> (res .= [x[1]^2 + x[2]^2])
G2 = Array{Float64}(undef, 2)
H2 = Array{Float64}(undef, 2, 2)
optf = OptimizationFunction(rosenbrock, Optimization.AutoEnzyme(), cons = cons)
optprob = Optimization.instantiate_function(optf, x0, Optimization.AutoEnzyme(),
nothing, 1)
# optprob.grad(G2, x0)
# @test G1 == G2
# optprob.hess(H2, x0)
# @test H1 == H2
# res = Array{Float64}(undef, 1)
# optprob.cons(res, x0)
# @test res == [0.0]
J = Array{Float64}(undef, 2)
optprob.cons_j(J, [5.0, 3.0])
@test J == [10.0, 6.0]
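Since the symptom was a constraint Jacobian stuck at zero, a package-free cross-check of the hand-written derivatives above is central finite differences. This is an editorial sketch, not part of the original test file; the step size `h` and the `fd_grad` helper are assumptions:

```julia
# Sketch: verify the analytic rosenbrock gradient against central finite
# differences, using only plain Julia (no AD packages involved).
rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

function g!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    G[2] = 200.0 * (x[2] - x[1]^2)
end

# Central finite-difference gradient; h is a hypothetical step size.
function fd_grad(f, x; h = 1e-6)
    map(eachindex(x)) do i
        e = zeros(length(x))
        e[i] = h
        (f(x + e) - f(x - e)) / (2h)
    end
end

x = [5.0, 3.0]
G = zeros(2)
g!(G, x)
@assert isapprox(G, fd_grad(rosenbrock, x); rtol = 1e-5)
```

If the analytic gradient disagreed with the finite-difference one here, the bug would be in the test's reference derivatives rather than in the AD backend.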
This is an issue in the KernelAbstractions extension package, which is transitively loaded via Tracker (see https://github.com/JuliaGPU/KernelAbstractions.jl/pull/412), hence whether Tracker is loaded determines whether the error appears xD.
In any case, it requires the fix to KA to land and a version bump, at least for your test to pass.
That's a very tricky one, how you isolated that is beyond me! Thanks a lot!
Basically I played "make the code, dependencies, etc. as minimal as possible" (which doesn't mean as concise as possible, but as close to bare-metal Julia as possible).
Fixes #564