Closed: fedebotu closed this 2 years ago
Thanks for the updates! Are the gradient tests passing with this? I think replacing the Function with the Module means the right backward pass doesn't get called (here), which would make the derivatives silently wrong.
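For reference, a minimal sketch of the failure mode (illustrative names, not the actual LQRStep code): a custom `autograd.Function` carries its analytic gradient in `backward()`, and `torch.autograd.gradcheck` verifies it against finite differences. A plain `nn.Module` forward would never invoke that hand-written backward, so a wrong derivative could pass silently unless gradcheck-style tests cover it.

```python
import torch

# Sketch only: a custom autograd.Function whose backward() implements an
# analytic gradient. If this were rewritten as a plain nn.Module, autograd
# would differentiate the forward ops instead, and the hand-written backward
# below would never run.
class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x  # analytic gradient

x = torch.randn(4, dtype=torch.double, requires_grad=True)
# gradcheck compares the analytic backward against finite differences;
# this is the kind of test that catches a silently wrong derivative.
assert torch.autograd.gradcheck(Square.apply, (x,))
```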
Here is a changelog of the fixes:
- `LQRStep` moved from `autograd.Function` to `nn.Module` for compatibility (more info about legacy autograd functions with non-static methods here and here)
- `save_for_backward` is apparently not needed in `nn.Module`, since it uses the functional API for saving the weights by default (more info here)
- `LQRStep` returns the backward and forward trajectories by default

A hedged sketch contrasting the two styles follows the list.
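As an illustration of the points above (hypothetical names, not the real implementation): the static-method `autograd.Function` style needs explicit `ctx.save_for_backward`, while an `nn.Module` forward is built from regular differentiable ops, so autograd keeps whatever intermediates it needs on its own and extra outputs can simply be returned from `forward`.

```python
import torch
import torch.nn as nn

# Illustrative sketch only; names are hypothetical, not the actual LQRStep code.

# Static-method autograd.Function style: tensors needed by the custom
# backward must be stashed explicitly with ctx.save_for_backward.
class SolveStepFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, w):
        ctx.save_for_backward(x, w)
        return x * w

    @staticmethod
    def backward(ctx, grad_output):
        x, w = ctx.saved_tensors
        return grad_output * w, grad_output * x

# nn.Module style: forward uses regular differentiable ops, so autograd
# records the graph itself and no save_for_backward call is required;
# auxiliary results can simply be returned alongside the main output.
class SolveStep(nn.Module):
    def forward(self, x, w):
        out = x * w
        return out, (x, w)  # main output plus auxiliary info

x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)
y1 = SolveStepFn.apply(x, w)   # uses the hand-written backward
y2, aux = SolveStep()(x, w)    # uses autograd's recorded graph
```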