Bug:
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after
they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or
autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or
if you need to access saved tensors after calling backward.
Bug in real person speak:
You have parameters that a gradient was already passed through and that were never reset.
Make sure that you free all gradients and rebuild any intermediate parameters before backpropagating a second time (a minimal repro is sketched below).
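A minimal sketch that reproduces the error in isolation (the tensor names here are invented for illustration): the second `backward()` walks the same graph whose saved tensors were freed by the first one.

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x * x).sum()  # the multiply saves x in the graph for its backward pass

y.backward()       # frees the graph's saved intermediate values
y.backward()       # walks the freed graph again -> raises the RuntimeError above
```

In dpLGAR the same thing happened one step removed: attributes built once in `__init__()` carried the first epoch's graph into the second epoch's backward pass.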
The fix:
The `self.soil_df` and `self.c` attributes set inside the `dpLGAR().__init__()` function were not being updated after backpropagation.
These variables depend on the updated Van Genuchten parameters, so they must be rebuilt after every backward pass!
Moved this initialization into the `set_internal_states()` function (see the sketch after this list)
Made sure to reset the global parameters too, as well as the `self.ponded_max_depth` clone
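A sketch of the shape of the fix, assuming a heavily simplified dpLGAR: the attribute and method names follow the writeup, but the derivations, shapes, and values are placeholders (`self.soil_df` would be rebuilt the same way).

```python
import torch

class dpLGAR(torch.nn.Module):
    """Simplified stand-in; only the re-initialization pattern is the point."""

    def __init__(self):
        super().__init__()
        # Learnable Van Genuchten parameters (placeholder values/shapes).
        self.alpha = torch.nn.Parameter(torch.tensor([0.5]))
        self.n = torch.nn.Parameter(torch.tensor([1.5]))
        self.ponded_max_depth_cm = torch.tensor([1.0])  # base value to clone from
        # Before the fix, self.c (and self.soil_df) were built once here, so
        # after the first backward() they still pointed at the freed graph.
        self.set_internal_states()

    def set_internal_states(self):
        # After the fix: everything derived from the learnable parameters is
        # rebuilt here at the start of every forward pass / epoch.
        self.c = self.alpha * (self.n - 1.0)  # placeholder derivation
        self.ponded_max_depth = self.ponded_max_depth_cm.clone()

    def forward(self, ponded_depth, infiltration):
        self.set_internal_states()  # fresh graph every pass
        runoff = torch.clamp(ponded_depth - infiltration, min=0)
        return (runoff * self.c).sum()


model = dpLGAR()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
for _ in range(2):  # the second epoch would crash without set_internal_states()
    loss = model(torch.tensor([2.0]), torch.tensor([1.0]))
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```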
What was done:
data/ dir
runoff = torch.clamp(ponded_depth - infiltration, min=0)
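A quick toy check of the clamp above (numbers invented): it keeps runoff non-negative while staying differentiable, though the gradient is cut to zero wherever infiltration exceeds the ponded depth.

```python
import torch

ponded_depth = torch.tensor([2.0, 0.5], requires_grad=True)
infiltration = torch.tensor([1.0, 1.0])

runoff = torch.clamp(ponded_depth - infiltration, min=0)
runoff.sum().backward()

print(runoff)             # tensor([1., 0.], grad_fn=...)
print(ponded_depth.grad)  # tensor([1., 0.]) -- no gradient through the clamped element
```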
Testing:
Notes/Debugging writeups: