**ghost** opened this issue 1 year ago (Open)
```julia
using ComponentArrays

function predict_adjoint(fullp, time_batch)
    Array(solve(prob, Tsit5(), p = ComponentArray(fullp), saveat = time_batch))
end
```
Thanks for the input!
However, I still get an error:
```
ERROR: MethodError: no method matching copy(::NamedTuple{(:layer_1, :layer_2), Tuple{NamedTuple{(:weight, :bias), Tuple{Matrix{Float32}, Matrix{Float32}}}, NamedTuple{(:weight, :bias), Tuple{Matrix{Float32}, Matrix{Float32}}}}})

Closest candidates are:
  copy(::Union{DiffEqNoiseProcess.BoxWedgeTail, DiffEqNoiseProcess.NoiseApproximation, DiffEqNoiseProcess.NoiseGrid, DiffEqNoiseProcess.NoiseWrapper, DiffEqNoiseProcess.VirtualBrownianTree})
   @ DiffEqNoiseProcess ~/.julia/packages/DiffEqNoiseProcess/VQe6Y/src/copy_noise_types.jl:55
  copy(::Random123.Threefry4x{T, R}) where {T, R}
   @ Random123 ~/.julia/packages/Random123/u5oEp/src/threefry.jl:266
  copy(::Zygote.Buffer)
   @ Zygote ~/.julia/packages/Zygote/JeHtr/src/tools/buffer.jl:64
  ...
```
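The `MethodError` happens because the solver and sensitivity machinery need array semantics (such as `copy`) on the parameter object, which a plain Lux parameter `NamedTuple` does not provide. A minimal sketch of how `ComponentArray` bridges the two; the `pp` layout below is hypothetical, chosen only to mirror a two-layer Lux model like the one in the error message:

```julia
using ComponentArrays

# Hypothetical parameter NamedTuple mirroring a two-layer Lux model
pp = (layer_1 = (weight = rand(Float32, 4, 2), bias = rand(Float32, 4, 1)),
      layer_2 = (weight = rand(Float32, 1, 4), bias = rand(Float32, 1, 1)))

ca = ComponentArray(pp)

ca isa AbstractVector        # true: optimizers/solvers can treat it as a flat vector
size(ca.layer_1.weight)      # (4, 2): named, structured access is preserved
copy(ca)                     # works, unlike copy(::NamedTuple) above
```

Because the flat-vector view and the named view share storage, nothing about the model definition has to change; only the object handed to the solver does.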
Edit: I think I got it to work by using a different approach. I left the `predict_adjoint` method as is:
```julia
function predict_adjoint(fullp, time_batch)
    Array(solve(prob, Tsit5(), p = fullp, saveat = time_batch))
end
```
and changed the definition of `optprob` to:

```julia
optprob = OptimizationProblem(optfun, ComponentArray(pp))
```
Now it runs. @Vaibhavdixit02, could you add this to the docs?
Following the documentation at https://docs.sciml.ai/Optimization/stable/tutorials/minibatch/#Data-Iterators-and-Minibatching, I tried to replace the Flux library with Lux. However, this yields an error.

Would you consider adding documentation on how to use mini-batches with Lux? It is the library used for universal differential equations, and it would be useful to apply this approach to train UODEs with different initial conditions/parameters.
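For the mini-batching side, which is independent of the Flux-vs-Lux choice, MLUtils.jl's `DataLoader` can iterate over batches while the trainable parameters stay a `ComponentArray`. A rough sketch under those assumptions; the data shapes and names here are illustrative, not from the tutorial:

```julia
using MLUtils          # provides DataLoader

# Illustrative data: 100 samples of a 2-dimensional input and a scalar target
x = rand(Float32, 2, 100)
y = rand(Float32, 1, 100)

loader = DataLoader((x, y); batchsize = 20, shuffle = true)

for (xb, yb) in loader
    # xb is 2×20 and yb is 1×20; in a UODE setting each batch would be
    # passed to something like predict_adjoint, with the Lux parameters
    # wrapped as a ComponentArray so they behave as a flat vector
end
```

Each iteration yields matching slices along the last (observation) dimension, which is the batching convention the linked tutorial relies on.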