NWC-CUAHSI-Summer-Institute / LGAR-py

LGAR in python/torch

Differentiable parameter learning fix #11

Closed by taddyb 1 year ago

taddyb commented 1 year ago

What was done:

Testing:

Notes/Debugging writeups:

Bug:

RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

Bug in real person speak:

The fix:
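As a general illustration only (not necessarily the change made in this PR), a common remedy for this error is to detach any tensor that is carried across iterations, so each call to .backward() only traverses the graph built in the current iteration:

```python
import torch

# Sketch of one common remedy (not necessarily the fix merged here):
# detach the carried state each iteration so the previous graph can be
# freed and backward() never revisits it.
w = torch.tensor([0.5], requires_grad=True)
h = torch.zeros(1)
optimizer = torch.optim.Adam([w], lr=0.01)

for step in range(10):
    h = h.detach()          # cut the link to the previous iteration's graph
    h = torch.tanh(h + w)
    loss = h.sum()
    optimizer.zero_grad()
    loss.backward()         # now succeeds on every iteration
    optimizer.step()
```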