Closed: erlebach closed this issue 1 year ago
Can you retry this on latest Enzyme main to confirm. cc @ChrisRackauckas
How do I do that using the package manager? You'll notice that I do not add Enzyme specifically to my project. I did a package update, which updated `Flux` and `DiffEqFlux`. However, looking at the Manifest, which I will attach here along with my latest `Project.toml` (I uploaded a zip file with both), there was no change to Zygote.
I could also try replacing `sciml_train` with the proper version from Optimization.jl?
Perhaps you are asking @ChrisRackauckas to update DiffEqFlux with the latest version of Zygote.jl? From the Manifest:

```toml
deps = ["Adapt", "Cassette", "ChainRulesCore", "ConsoleProgressMonitor", "DataInterpolations", "DiffEqBase", "DiffResults", "Distributions", "DistributionsAD", "Flux", "ForwardDiff", "LinearAlgebra", "Logging", "LoggingExtras", "Lux", "NNlib", "Optim", "Optimization", "OptimizationFlux", "OptimizationOptimJL", "OptimizationPolyalgorithms", "Printf", "ProgressLogging", "Random", "RecursiveArrayTools", "Reexport", "Requires", "SciMLBase", "SciMLSensitivity", "StaticArrays", "TerminalLoggers", "Zygote", "ZygoteRules"]

[[deps.Zygote]]
deps = ["AbstractFFTs", "ChainRules", "ChainRulesCore", "DiffRules", "Distributed", "FillArrays", "ForwardDiff", "GPUArrays", "GPUArraysCore", "IRTools", "InteractiveUtils", "LinearAlgebra", "LogExpFunctions", "MacroTools", "NaNMath", "Random", "Requires", "SparseArrays", "SpecialFunctions", "Statistics", "ZygoteRules"]
git-tree-sha1 = "a6f1287943ac05fae56fa06049d1a7846dfbc65f"
uuid = "e88e6eb3-aa80-5325-afca-941959d7151f"
version = "0.6.51"
```
Zygote.jl has newer releases available, while DiffEqFlux is using an earlier version (0.6.51).
Finally, I just created a new project and added only `Zygote.jl` to the `Project.toml` file. I got Zygote version 0.6.51 and not the latest version. That would imply that the package manager is not retrieving the latest version of `Zygote.jl`. Perhaps I am doing something wrong? I thought the latest version would have been retrieved, since there are no other packages it could conflict with.
cc: @ChrisRackauckas
Thanks.
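As an aside, one way to check which packages the resolver is holding back, and to surface the conflicting compat bounds, is the `outdated` status mode (a sketch; assumes Julia 1.8 or newer):

```julia
using Pkg

# List packages whose installed version is older than the newest
# compatible release; REPL equivalent: ]st --outdated
Pkg.status(outdated = true)

# Attempting to force a newer Zygote prints the compat bounds
# that prevent the upgrade (hypothetical version bound shown):
# Pkg.add(name = "Zygote", version = "0.6")
```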
He means, do `]add Enzyme#main` and see if you get an error instead of a segfault. Enzyme has some commits on main which aren't tagged.
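For reference, the same thing can be done through the Pkg API rather than the REPL mode (a sketch; the repository URL is the one that later appears in the Manifest):

```julia
using Pkg

# Track the development branch instead of a registered release.
# Equivalent to `]add Enzyme#main` in the Pkg REPL.
Pkg.add(url = "https://github.com/EnzymeAD/Enzyme.jl.git", rev = "main")

# To go back to registered releases later:
# Pkg.free("Enzyme")
```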
OK, I did as requested and got this error:
```
WARNING: redefinition of constant host_platform. This may fail, cause incorrect answers, or produce other errors.
┌ Error: Error watching manifest
│ exception =
│ MethodError: no method matching (::Enzyme_jll.var"#make_wrapper_dict#4"{Enzyme_jll.var"#parse_wrapper_platform#3"})(::String, ::Vector{String})
│ Stacktrace:
│  [1] top-level scope
│    @ ~/.julia/packages/JLLWrappers/QpMQW/src/toplevel_generators.jl:156
│ Revise evaluation error at /Users/erlebach/.julia/packages/JLLWrappers/QpMQW/src/toplevel_generators.jl:156
│
│ Stacktrace:
│  [1] methods_by_execution!(recurse::Any, methodinfo::Revise.CodeTrackingMethodInfo, docexprs::Dict{Module, Vector{Expr}}, mod::Module, ex::Expr; mode::Symbol, disablebp::Bool, always_rethrow::Bool, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
│    @ Revise ~/.julia/packages/Revise/do2nH/src/lowered.jl:227
└ @ Revise ~/.julia/packages/Revise/do2nH/src/pkgs.jl:477
```
although installation seemed to proceed nonetheless. It is now precompiling "stuff", including `DiffEqFlux`.
I notice that only `deps.SciMLSensitivity` uses `Enzyme.jl`.
DiffEqFlux must have been upgraded, because `AMSGrad` is no longer exported. I replaced it with `DiffEqFlux.AMSGrad`.
I am running the command:

```julia
result_univ = DiffEqFlux.sciml_train(loss_fn, θi,
    DiffEqFlux.AMSGrad(),
    cb = cb_fun,
    allow_f_increases = false,
    maxiters = 5)
```
Yes! I got a regular stack trace. Here it is:
```
WARNING: both OptimizationOptimisers and Flux export "AMSGrad"; uses of it in module Main must be qualified
ERROR: UndefVarError: AMSGrad not defined
Stacktrace:
 [1] top-level scope
   @ ~/src/2022/rude/giesekus/rude.jl:324
┌ Warning: sciml_train is being deprecated in favor of direct usage of Optimization.jl. Please consult the Optimization.jl documentation for more details. Optimization.jl's PolyOpt solver is the polyalgorithm of sciml_train
└ @ DiffEqFlux ~/.julia/packages/DiffEqFlux/2IJEZ/src/train.jl:6
ERROR: UndefVarError: callback not defined
Stacktrace:
 [1] cb_fun(θ::Vector{Float64}, l::Float64)
   @ Main ~/src/2022/rude/giesekus/rude.jl:323
 [2] macro expansion
   @ ~/.julia/packages/OptimizationFlux/zHcx5/src/OptimizationFlux.jl:34 [inlined]
 [3] macro expansion
   @ ~/.julia/packages/Optimization/o00ZS/src/utils.jl:37 [inlined]
 [4] __solve(prob::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, DiffEqFlux.var"#93#100"{typeof(loss_fn)}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Bool, Tuple{Symbol}, NamedTuple{(:allow_f_increases,), Tuple{Bool}}}}, opt::Flux.Optimise.AMSGrad, data::Base.Iterators.Cycle{Tuple{Optimization.NullData}}; maxiters::Int64, callback::Function, progress::Bool, save_best::Bool, kwargs::Base.Pairs{Symbol, Bool, Tuple{Symbol}, NamedTuple{(:allow_f_increases,), Tuple{Bool}}})
   @ OptimizationFlux ~/.julia/packages/OptimizationFlux/zHcx5/src/OptimizationFlux.jl:31
 [5] #solve#536
   @ ~/.julia/packages/SciMLBase/VKnrY/src/solve.jl:84 [inlined]
 [6] sciml_train(::typeof(loss_fn), ::Vector{Float64}, ::Flux.Optimise.AMSGrad, ::Nothing; lower_bounds::Nothing, upper_bounds::Nothing, cb::Function, callback::Function, maxiters::Int64, kwargs::Base.Pairs{Symbol, Bool, Tuple{Symbol}, NamedTuple{(:allow_f_increases,), Tuple{Bool}}})
   @ DiffEqFlux ~/.julia/packages/DiffEqFlux/2IJEZ/src/train.jl:43
 [7] top-level scope
   @ ~/src/2022/rude/giesekus/rude.jl:324
```
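Neither error is Enzyme-related: `AMSGrad` is exported by both Flux and OptimizationOptimisers, so a bare `AMSGrad` must be qualified, and `cb_fun` at rude.jl:323 apparently references an undefined name `callback`. A minimal sketch of a callback with the two-argument signature the trace shows (the body is hypothetical, since the original `cb_fun` is not shown):

```julia
using Flux  # qualify AMSGrad to resolve the export clash

opt = Flux.Optimise.AMSGrad()

# The callback receives the parameters and the current loss,
# and should return `false` to continue (or `true` to stop early).
function cb_fun(θ, loss)
    println("loss = ", loss)
    return false
end
```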
I tried replying by email, but it did not work.
Status results:

```
Status `~/src/2022/rude/giesekus/Project.toml`
  [fbb218c0] BSON v0.3.6
  [aae7a2af] DiffEqFlux v1.53.0
  [0c46a032] DifferentialEquations v7.6.0
  [7da242da] Enzyme v0.10.12 https://github.com/EnzymeAD/Enzyme.jl.git#main
  [587475ba] Flux v0.13.10
  [429524aa] Optim v1.7.4
  [7f7a1694] Optimization v3.10.0
  [36348300] OptimizationOptimJL v0.1.5
  [42dfb2eb] OptimizationOptimisers v0.1.1
  [1dea7af3] OrdinaryDiffEq v6.35.1
  [91a5bcdd] Plots v1.38.0
  [295af30f] Revise v3.4.0
  [e88e6eb3] Zygote v0.6.51
  [8bb1440f] DelimitedFiles
  [37e2e46d] LinearAlgebra
```
(I did not include `using Enzyme` in my code; it is only used by `SciMLSensitivity.jl`.) Here is the `Enzyme` entry in the `Manifest.toml` file:
```toml
[[deps.Enzyme]]
deps = ["CEnum", "EnzymeCore", "Enzyme_jll", "GPUCompiler", "LLVM", "Libdl", "LinearAlgebra", "ObjectFile", "Oceananigans", "Printf", "Random"]
git-tree-sha1 = "c2b739530f2c209a2629734edf4380e029e543f5"
repo-rev = "main"
repo-url = "https://github.com/EnzymeAD/Enzyme.jl.git"
uuid = "7da242da-08ed-463a-9acd-ee780be4f1d9"
version = "0.10.12"
```
That's good. This should probably be closed then as it seems Enzyme fixed this on main. @wsmoses any way we can get a tag?
We can continue updating the rest of your code in a different thread. Seems like it has some old stuff in there.
> I could also try replacing `sciml_train` with the proper version from Optimization.jl?
Yes, it throws a big warning saying that should be done. DiffEqFlux was simplified to just being about the machine learning layers, with the adjoints being in SciMLSensitivity (which will be loaded automatically on Julia v1.9 when weak dependencies come out). So you want `using Optimization, OptimizationFlux` and then solve the optimization there. Let's follow up in a separate Discourse post so we don't bug Billy.
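A rough sketch of that migration, reusing `loss_fn`, `θi`, and `cb_fun` from the `sciml_train` call earlier in the thread (untested; `allow_f_increases` has no equivalent here and is dropped):

```julia
using Optimization, OptimizationFlux, Flux

# Wrap the loss; AutoZygote selects Zygote for the gradient.
# Optimization.jl losses take (θ, p), so ignore the unused p.
optf = OptimizationFunction((θ, p) -> loss_fn(θ), Optimization.AutoZygote())
prob = OptimizationProblem(optf, θi)

# Flux optimizers are passed directly to solve via OptimizationFlux.
result_univ = solve(prob, Flux.Optimise.AMSGrad();
                    callback = cb_fun, maxiters = 5)
```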
I am running `rude.jl` from folder `giesekus`. The code generates an error with a stack trace almost 1000 lines long. I attach the trace as a file along with the exact code I am running. Finally, I provide the `Project.toml` file. I am running on a Mac M1 with Ventura OS.
stack_trace.txt
rude.jl
And finally, the packages loaded (`Pkg.status`):