> However, doing this triggers a bunch of the following warnings (errors?):
They are just warnings. They are harmless, but they're there to tell you that an optimization (i.e. using Enzyme) has been disabled. It should still automatically fall back to ReverseDiffVJP in this example (with tape compilation), so I presume it just runs fine?
This warning was added recently to help find out where extra performance optimizations are turned off. The warning text is supposed to be benign; if you have any ideas for how we can improve it, let us know.
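For what it's worth, the selection can also be bypassed entirely by passing the VJP choice explicitly through `sensealg`. A sketch, with a toy ODE standing in for the tutorial's problem:

```julia
using OrdinaryDiffEq, SciMLSensitivity, Zygote

# Toy ODE in place of the tutorial's problem (an assumption for illustration):
f!(du, u, p, t) = (du .= p .* u)
prob = ODEProblem(f!, [1.0, 1.0], (0.0, 1.0), [-0.5, 0.3])

# Pinning the VJP backend to ReverseDiff with tape compilation means the
# automatic Enzyme attempt (and its warning) is skipped during the adjoint:
loss(p) = sum(abs2, Array(solve(prob, Tsit5(); p = p,
    sensealg = InterpolatingAdjoint(autojacvec = ReverseDiffVJP(true)))))
∇p = Zygote.gradient(loss, [-0.5, 0.3])[1]
```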
Got it. Yeah, sure, I got the expected output. It was just surprising that these warnings suddenly showed up when I changed the model in the way described above; there were no warnings before that change...
Also, the warnings were a bit confusing to me, as I (thought I) used Zygote, not Enzyme, for AD in this example. Basically, the whole warning text threw me off: the references to Enzyme, all the technical details following "Enzyme execution failed", the message "You may be using a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/#Activity-of-temporary-storage). If not, please open an issue [...]", and so on. Maybe it would help if, at the beginning of the warning, you stated more clearly when the following text can be ignored? E.g.: "If you are not using Enzyme for automatic differentiation, the following can be ignored." Or something like that?
> Also, the warnings were a bit confusing to me, as I (thought I) used Zygote, not Enzyme, for AD in this example.
The adjoint for Zygote in this case is defined to solve the adjoint ODE, which is set up to try Enzyme for the VJP computations and fall back to something slower if required.
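In other words, the Zygote call on the outside and the VJP backend on the inside are two separate choices. A sketch (reusing the toy `prob` from the block above):

```julia
using Zygote, OrdinaryDiffEq, SciMLSensitivity

# Outer AD: Zygote differentiates the scalar loss end to end.
# Inner AD: each VJP inside the adjoint ODE solve uses whatever `autojacvec`
# resolves to; with the default below, Enzyme is tried first, with slower
# fallbacks if it fails (that is what the warning reports).
loss(p) = sum(abs2, Array(solve(prob, Tsit5(); p = p,
                                sensealg = InterpolatingAdjoint())))
grads = Zygote.gradient(loss, [-0.5, 0.3])
```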
> Maybe it would help if, at the beginning of the warning, you stated more clearly when the following text can be ignored? E.g.: "If you are not using Enzyme for automatic differentiation, the following can be ignored." Or something like that?
https://github.com/SciML/SciMLSensitivity.jl/pull/900 should make it more clear.
Closing due to the nicer warning from https://github.com/SciML/SciMLSensitivity.jl/pull/900
(I hope this is the right place to report this bug/peculiar behavior)
I was playing around with the missing physics tutorial in the documentation when I came across some (to an outsider) strange behavior:
I add a third (dummy) component to the `u` vector in the Lotka-Volterra equations, update `u0` to match, change the neural network `U` input and output dimensions to 3, and make the hybrid model I want to learn include this dummy state. Roughly, the modification looks like the sketch below.
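(The exact values, activation, and layer sizes here are placeholders rather than my actual code:)

```julia
using Lux, Random

# Lotka-Volterra with an extra (dummy) third state:
function lotka!(du, u, p, t)
    α, β, γ, δ = p
    du[1] = α * u[1] - β * u[2] * u[1]
    du[2] = γ * u[1] * u[2] - δ * u[2]
    du[3] = 0.0  # dummy component, constant in time
end

u0 = [5.0, 5.0, 1.0]  # third entry added for the dummy state (placeholder values)

# Neural network with input and output dimension 3 instead of 2:
rng = Random.default_rng()
U = Lux.Chain(Lux.Dense(3, 5, tanh), Lux.Dense(5, 5, tanh), Lux.Dense(5, 3))
p_nn, st = Lux.setup(rng, U)

# Hybrid (UDE) dynamics: known terms plus the neural-network correction
function ude_dynamics!(du, u, p, t, p_true)
    û = U(u, p, st)[1]           # network prediction
    du[1] = p_true[1] * u[1] + û[1]
    du[2] = -p_true[4] * u[2] + û[2]
    du[3] = û[3]                 # third dimension driven only by the network
end
```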
With that change, the solution, if we ignore the third dimension, should be the same as the original one, right? However, doing this triggers a bunch of "Enzyme execution failed" warnings (errors?) when I am trying to train the network in the hybrid model.
Is this a bug? If not, an explanation in the tutorial would be helpful...