caseykneale closed this issue 4 years ago.
@dhairyagandhi96 do you know what this could be? Sounds like it could be a `Flux.train!` thing, not really DiffEqFlux related.
Tried rolling back Flux, Zygote, and OrdinaryDiffEq versions with no luck. I think I may have upgraded to Julia 1.4.2 since this last worked. Worried it might be a Base bug? I mean, an Array is disappearing. Still trying to debug, but no luck so far...
It seems something is going wrong in the gradient; `gradient(() -> loss(x, y), Flux.params(nn, nn_ode.p, fc))` is causing the issue.
I was able to resolve the issue by changing the tstep to `(0.0, 1.0)` instead of `(0.0f0)`.
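For context, a minimal sketch of that fix (hypothetical layer sizes, not the exact code from this issue): in Julia, `(0.0f0)` is just a parenthesized scalar, not a tuple, so it cannot serve as a time span. `NeuralODE` expects a `(start, stop)` tuple.

```julia
using DiffEqFlux, OrdinaryDiffEq, Flux

# Hypothetical network; the issue's actual architecture may differ.
dudt = Chain(Dense(2, 16, tanh), Dense(16, 2))

# Wrong: (0.0f0) is just the scalar 0.0f0, not a time span.
# tspan = (0.0f0)

# Right: tspan must be a (start, stop) tuple.
tspan = (0.0f0, 1.0f0)

nn_ode = NeuralODE(dudt, tspan, Tsit5(), saveat = 0.1f0)
```

Note that even a one-element tuple, written `(0.0f0,)`, would not work here; the solver needs both a start and an end time.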
@caseykneale Apart from that, I am not sure if it is intended, but in the code `x` has a batch size of 20 and `y` has a batch size of 1.
Oh yes that's a mistake on my part... I forgot that was the tstep arg.
Looks like I'm missing a transpose too. This resolved the issue. I'm okay if this gets closed :).
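A quick sketch of the batch-size mismatch and the transpose fix (the shapes here are assumptions based on the thread, not the issue's actual data): Flux treats the second array dimension as the batch dimension, so `x` and `y` must agree there.

```julia
using Flux

# Hypothetical shapes: 2 input features, batch of 20.
x = rand(Float32, 2, 20)   # features × batch
y = rand(Float32, 1, 20)   # targets must share the batch size of 20

# A 20×1 column of targets has the batch on the wrong axis;
# a transpose lines it up with x's batch dimension.
y_col   = rand(Float32, 20, 1)
y_fixed = permutedims(y_col)   # now 1×20
```

With mismatched batch dimensions, a loss like `Flux.mse` can silently broadcast or error depending on the shapes involved, which is why this kind of bug is easy to miss.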
Tried working through the basic supervised example, and I'm getting a really strange error related to the output from the NNODE layer. Pretty peculiar: when the model is called from `Flux.train!` the data disappears? When called outside of `train!` it's all fine. No clue where this bug ends up; might be Zygote?