This is on DiffEqFlux 1.8.0.
I'm working on a PR for this.
Hi @ranjanan,
```julia
using DiffEqFlux, OrdinaryDiffEq, Flux, Optim, Plots

u0 = Float32[2.; 0.]
datasize = 30
tspan = (0.0f0, 1.5f0)

function trueODEfunc(du, u, p, t)
    true_A = [-0.1 2.0; -2.0 -0.1]
    du .= ((u.^3)'true_A)'
end

t = range(tspan[1], tspan[2], length = datasize)
prob = ODEProblem(trueODEfunc, u0, tspan)
ode_data = Array(solve(prob, Tsit5(), saveat = t))

dudt2 = FastChain((x, p) -> x.^3,
                  FastDense(2, 50, tanh),
                  FastDense(50, 2))
n_ode = NeuralODE(dudt2, tspan, Tsit5(), saveat = t)

function predict_n_ode(p)
    n_ode(eltype(p).(u0), p)
end

function loss_n_ode(p)
    pred = predict_n_ode(p)
    loss = sum(abs2, ode_data .- pred)
    loss, pred
end

loss_n_ode(n_ode.p) # n_ode.p stores the initial parameters of the neural ODE

# Display the ODE with the initial parameter values:
# cb(n_ode.p, loss_n_ode(n_ode.p)...)

res1 = DiffEqFlux.sciml_train(loss_n_ode, DiffEqFlux.BBO(), maxiters = 300,
                              lower_bounds = [-1000.0 for i in 1:252],
                              upper_bounds = [1000.0 for i in 1:252])
```
This works. In your script you were passing an initial parameter, which is not needed; callbacks are not supported with BBO; and the bounds arguments were missing.
Can we figure out a way to make callbacks work with BBO? I thought there was a mechanism for that? Otherwise we should probably make some upstream changes.
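For reference, BlackBoxOptim does expose a callback hook on `bboptimize` that `sciml_train` could forward `cb` to. Here's a minimal sketch, assuming the `CallbackFunction`/`CallbackInterval` options and that the callback receives the optimization controller (both assumptions worth checking against the BlackBoxOptim docs):

```julia
using BlackBoxOptim

# Standalone toy objective, unrelated to the neural ODE above.
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

res = bboptimize(rosenbrock;
                 SearchRange = (-5.0, 5.0), NumDimensions = 2, MaxSteps = 10_000,
                 # Assumed signature: the callback gets the run controller and
                 # fires roughly every CallbackInterval seconds; also assuming
                 # best_fitness accepts that controller object.
                 CallbackFunction = oc -> println("best fitness so far: ", best_fitness(oc)),
                 CallbackInterval = 0.5)
```

If that hook holds up, `sciml_train` could wrap the DiffEqFlux-style `cb(p, loss, pred...)` in a `CallbackFunction` upstream.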
I was calling this wrong; I changed it to this:
```julia
res1 = DiffEqFlux.sciml_train(loss_n_ode, DiffEqFlux.BBO(),
                              lower_bounds = [-1 for _ in 1:length(n_ode.p)],
                              upper_bounds = [1. for _ in 1:length(n_ode.p)],
                              cb = cb, maxiters = 300)
```
and I get:
```
julia> include("test.jl")
321.23785f0
ERROR: LoadError: ArgumentError: Using Array{Tuple{Int64,Float64},1} for SearchRange is not supported.
Stacktrace:
 [1] check_and_create_search_space(::DictChain{Symbol,Any}) at /home/ranjan/.julia/packages/BlackBoxOptim/ZdVko/src/default_parameters.jl:71
 [2] setup_problem(::Function, ::DictChain{Symbol,Any}) at /home/ranjan/.julia/packages/BlackBoxOptim/ZdVko/src/bboptimize.jl:27
 [3] #bbsetup#86(::Base.Iterators.Pairs{Symbol,Any,NTuple{4,Symbol},NamedTuple{(:Method, :SearchRange, :MaxSteps, :cb),Tuple{Symbol,Array{Tuple{Int64,Float64},1},Int64,var"#82#84"}}}, ::typeof(bbsetup), ::Function, ::Dict{Symbol,Any}) at /home/ranjan/.julia/packages/BlackBoxOptim/ZdVko/src/bboptimize.jl:87
 [4] #bbsetup at ./none:0 [inlined]
 [5] #bboptimize#85(::Base.Iterators.Pairs{Symbol,Any,NTuple{4,Symbol},NamedTuple{(:Method, :SearchRange, :MaxSteps, :cb),Tuple{Symbol,Array{Tuple{Int64,Float64},1},Int64,var"#82#84"}}}, ::typeof(bboptimize), ::Function, ::Dict{Symbol,Any}) at /home/ranjan/.julia/packages/BlackBoxOptim/ZdVko/src/bboptimize.jl:70
 [6] #bboptimize at ./none:0 [inlined] (repeats 2 times)
 [7] #sciml_train#216(::Array{Int64,1}, ::Array{Float64,1}, ::Int64, ::Base.Iterators.Pairs{Symbol,var"#82#84",Tuple{Symbol},NamedTuple{(:cb,),Tuple{var"#82#84"}}}, ::typeof(DiffEqFlux.sciml_train), ::Function, ::DiffEqFlux.BBO, ::Base.Iterators.Cycle{Tuple{DiffEqFlux.NullData}}) at /home/ranjan/.julia/dev/DiffEqFlux/src/train.jl:328
 [8] (::DiffEqFlux.var"#kw##sciml_train")(::NamedTuple{(:lower_bounds, :upper_bounds, :cb, :maxiters),Tuple{Array{Int64,1},Array{Float64,1},var"#82#84",Int64}}, ::typeof(DiffEqFlux.sciml_train), ::Function, ::DiffEqFlux.BBO) at none:0
 [9] top-level scope at /home/ranjan/.julia/dev/ARPAEMERL/test/test.jl:46
 [10] include at ./boot.jl:328 [inlined]
 [11] include_relative(::Module, ::String) at ./loading.jl:1105
 [12] include(::Module, ::String) at ./Base.jl:31
 [13] include(::String) at ./client.jl:424
 [14] top-level scope at REPL[12]:1
in expression starting at /home/ranjan/.julia/dev/ARPAEMERL/test/test.jl:46
```
It doesn't make sense to bound the parameters of a neural network to be less than 1. I'm sure that's not what you meant?
> Array{Tuple{Int64,Float64},1}

You'll need to pass both the upper and lower bounds with the same element type.
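For the script above, a sketch of the corrected call, with both bound vectors as `Float64` (and `cb` dropped, since the earlier comment says callbacks aren't supported with BBO):

```julia
res1 = DiffEqFlux.sciml_train(loss_n_ode, DiffEqFlux.BBO(),
                              lower_bounds = [-1.0 for _ in 1:length(n_ode.p)],
                              upper_bounds = [1.0 for _ in 1:length(n_ode.p)],
                              maxiters = 300)
```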
No, this is my bad again; I got it to work. However, the API is different from the other DiffEqFlux functions. Perhaps it should be more like:

```julia
sciml_train(loss, _theta, BBO(; kwargs); kwargs)
```

Instead it's now:

```julia
sciml_train(loss, BBO(), lower_bound, upper_bound)
```
At minimum, this needs an error message.
Yes, it should error if you don't pass both bounds. The kwargs should have no default, which would make Julia throw an error message, so something needs to change in our implementation.
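A minimal sketch of the required-keyword pattern (hypothetical `train_bbo` helper, not the actual `sciml_train` code):

```julia
# Keywords declared without defaults are required: Julia raises an
# UndefKeywordError if the caller omits them, with no extra checks needed.
function train_bbo(loss; lower_bounds, upper_bounds, maxiters = 1000)
    @assert eltype(lower_bounds) == eltype(upper_bounds) "bounds must share an element type"
    # ... hand the problem to bboptimize with
    #     SearchRange = collect(zip(lower_bounds, upper_bounds)) ...
end

train_bbo(x -> sum(abs2, x))
# ERROR: UndefKeywordError: keyword argument lower_bounds not assigned
```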
> perhaps it should be more like:
It should be more like that: right now it's just version 1.
We need to really clean up and document how we handle box constraints.
I agree that having two different interfaces is a bit awkward, but this is because BlackBoxOptim doesn't take an initial parameter value.
> The kwargs should have no default and that would make Julia throw an error message,
@ChrisRackauckas it is implemented that way.
PR incoming.
Interesting. It should throw a "lower_bound not passed" kind of error message then?
Yes it does
The bounds were passed; the issue was that they were not of the same type.
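For reference, the mixed element types show up as soon as the two vectors are zipped into a `SearchRange` (which is what the stack trace suggests `sciml_train` does):

```julia
julia> collect(zip([-1, -1], [1.0, 1.0]))  # Int lower bounds, Float64 upper bounds
2-element Array{Tuple{Int64,Float64},1}:
 (-1, 1.0)
 (-1, 1.0)
```

Making both vectors `Float64` yields `Tuple{Float64,Float64}` pairs instead, which BlackBoxOptim accepts.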
Yes, in my case I made the above error ^. Happy to bikeshed on the API in #200.