Closed f-dangel closed 4 years ago
Thanks! We did not anticipate that someone would use two quadratic problems without restarting, but this PR fixes it. Great!

Adding a mental note that the behavior now differs from the TensorFlow implementation.
Background: I fixed the seed of `torch`/`scipy`/`numpy`/`random` and created a `quadratic_deep` test problem. Then I performed a forward pass. Then I did exactly the same a second time and expected the result of the forward pass to be the same. This was not true.

Reason: Creating two `quadratic_deep` problems results in different Hessians, because the random number generator used to generate them lives at the module level, and I could not 'reset' it.

This PR moves the random number generator to the function level.
Demo:

- Before this PR: `h1` is different from `h2`.
- After this PR: `h1` is equal to `h2`.