ChrisRackauckas / universal_differential_equations

Repository for the Universal Differential Equations for Scientific Machine Learning paper, describing a computational basis for high performance SciML
https://arxiv.org/abs/2001.04385
MIT License

training method for multiple neural-nets in universal ODEs #31

Closed yewalenikhil65 closed 3 years ago

yewalenikhil65 commented 3 years ago

What is the way to train multiple neural nets in universal ODEs? Do we train them together (if so, how?) or do we train them sequentially?


NN_1 = FastChain(FastDense(2, 32, tanh), FastDense(32, 32, tanh), FastDense(32, 1))
p1 = initial_params(NN_1)
NN_2 = FastChain(FastDense(2, 32, tanh), FastDense(32, 32, tanh), FastDense(32, 1))
p2 = initial_params(NN_2)  # note: each network needs its own initial parameters

function dudt_(u, p_, t)
    x, y, z = u
    z1 = NN_1([x, y], p1)  # each network returns a 1-element vector
    z2 = NN_2([y, z], p2)
    [ p_[1]*x + z1[1],
     -p_[3]*y + z2[1],
      p_[4]*z - z1[1] - z2[1]
    ]
end # where p_[1], p_[3], p_[4] are known parameters passed through p

prob_nn = ODEProblem(dudt_,u0, tspan, p)
sol_nn = solve(prob_nn)

function predict()  # etc..etc
function loss()

How do I update/train NN_1 and NN_2 together, using say sciml_train, if that's how I should proceed?

ChrisRackauckas commented 3 years ago

Use p = [p1; p2] and then split it inside dudt_ via a @view. Let me know if you think this needs a tutorial.
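A minimal sketch of this suggestion, using the FastChain-era DiffEqFlux API from the thread. The known-parameter values, `predict`, `loss`, and the `ADAM`/`sciml_train` training call are illustrative assumptions, not from the thread; the key idea is that both networks' weights live in one flat vector `θ`, which is split by index with `@view` inside the ODE function, so a single optimizer call updates both nets together.

```julia
using DiffEqFlux, OrdinaryDiffEq

NN_1 = FastChain(FastDense(2, 32, tanh), FastDense(32, 32, tanh), FastDense(32, 1))
NN_2 = FastChain(FastDense(2, 32, tanh), FastDense(32, 32, tanh), FastDense(32, 1))
p1 = initial_params(NN_1)
p2 = initial_params(NN_2)
n1 = length(p1)        # remember where the first net's weights end
θ  = [p1; p2]          # one flat parameter vector for the optimizer

p_known = (0.5, 0.1, 0.2)  # illustrative stand-ins for the known p_[1], p_[3], p_[4]

function dudt_(u, θ, t)
    x, y, z = u
    θ1 = @view θ[1:n1]        # first net's parameters
    θ2 = @view θ[n1+1:end]    # second net's parameters
    z1 = NN_1([x, y], θ1)
    z2 = NN_2([y, z], θ2)
    [ p_known[1]*x + z1[1],
     -p_known[2]*y + z2[1],
      p_known[3]*z - z1[1] - z2[1]]
end

u0 = Float32[1.0, 1.0, 1.0]
tspan = (0.0f0, 1.0f0)
prob_nn = ODEProblem(dudt_, u0, tspan, θ)

predict(θ) = Array(solve(prob_nn, Tsit5(), p = θ, saveat = 0.1))

# A placeholder loss; in practice this would compare against data.
loss(θ) = sum(abs2, predict(θ))

# Both networks are updated together, since the optimizer sees one vector θ:
res = DiffEqFlux.sciml_train(loss, θ, ADAM(0.01), maxiters = 100)
```

Because `θ1` and `θ2` are views rather than copies, no extra allocations are introduced on each right-hand-side evaluation, and reverse-mode AD differentiates through the indexing cleanly.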

ChrisRackauckas commented 3 years ago

I can't seem to transfer this issue, so I'm going to close it, but I'm tracking it for a tutorial at https://github.com/SciML/DiffEqFlux.jl/issues/459