Closed: Varal7 closed this issue 3 years ago
I think this is fixed by https://github.com/pytorch/pytorch/pull/56017 if you use PyTorch nightly. Could you double check?
The bug is still there as of commit 935057fc7464d0df6741ffc24d5aed3131533073 (Author: Lily Johnson <lillianjohnson@fb.com>, Date: Tue Jun 8 08:01:01 2021 -0700).
Oh yes, my bad. These C++ tensors are temporary as well. I guess your SavedVariable improvements will avoid this problem.
Another quick fix could also be to just save a reference ourselves, like what we already have with `seen` for non-custom functions, as sketched below.
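A minimal sketch of what that could look like, assuming a `keep_alive` list and a `register` helper (both names are mine, not existing torchviz code): hold a Python reference to every object keyed by `id()` so the id cannot be reused while the graph is being built.

```python
# Hedged sketch: keep_alive and register are hypothetical, not torchviz's API.
seen = set()        # ids already turned into graph nodes, as make_dot does today
keep_alive = []     # extra references so temporaries stay alive during the build

def register(obj):
    # Holding a reference guarantees id(obj) stays unique for the whole build.
    keep_alive.append(obj)
    if id(obj) not in seen:
        seen.add(id(obj))
        # ... emit the dot node keyed by str(id(obj)) here ...
```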
In general, I think we should do that, yes. The use of `id()` is only valid as long as the PyObject is alive.
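A quick standalone illustration of that point (plain CPython, nothing PyTorch-specific):

```python
# id() is the object's memory address in CPython; once the object is freed,
# a new object may be allocated at the same address and get the same id.
a = object()
stale_id = id(a)
del a                      # the PyObject is deallocated here
b = object()               # CPython often reuses the freed slot
print(stale_id == id(b))   # can print True: same id, but a different object
```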
In the following example, the saved variable for each FixedGradientFunctionBackward node should be different, but they are merged into a single dot node:
https://colab.research.google.com/drive/1MyOV58n6oex9X_Z5HSRf-Gb87dclYNKN?usp=sharing
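For reference, a minimal sketch of the kind of repro described above (the actual code is in the linked colab; `FixedGradientFunction` is inferred from the node name, and `show_saved=True` is assumed to be the torchviz flag in use):

```python
import torch
from torchviz import make_dot

class FixedGradientFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, grad):
        ctx.save_for_backward(grad)   # each call saves its own tensor
        return x

    @staticmethod
    def backward(ctx, grad_output):
        (grad,) = ctx.saved_tensors
        return grad, None

x = torch.randn(3, requires_grad=True)
# Two FixedGradientFunctionBackward nodes, each with a distinct saved tensor...
y = FixedGradientFunction.apply(x, torch.ones(3)) + FixedGradientFunction.apply(x, 2 * torch.ones(3))
# ...yet in the rendered graph their saved variables can collapse into one node,
# because the unpacked saved tensors are temporaries whose id() gets reused.
make_dot(y, params={"x": x}, show_saved=True)
```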