Closed thomasw21 closed 2 years ago
Great! Please squash those commits into a single one and then we can merge this PR.
I can do this. Though GitHub usually has an option in the repository settings to prevent merge commits; I usually keep the following deactivated.
Thanks for your contribution! 🚀
torch.autograd.backward
only supports lists of tensors: https://pytorch.org/docs/stable/generated/torch.autograd.backward.html . This breaks the previous assumption, as `self.res_torch` could be a nested structure. We propose to recursively flatten everything before passing it to the backward step. We also add a test demonstrating that it works.
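The recursive flattening described above could be sketched roughly as follows. This is a minimal illustration, not the PR's actual implementation; the `flatten` helper name and the call shape are assumptions, and the leaves would be `torch.Tensor` objects in practice:

```python
def flatten(obj):
    """Recursively flatten nested dicts/lists/tuples into a flat list of leaves.

    Leaves (e.g. torch.Tensor objects) are returned in a single flat list,
    which is the shape torch.autograd.backward expects.
    """
    if isinstance(obj, dict):
        out = []
        for value in obj.values():
            out.extend(flatten(value))
        return out
    if isinstance(obj, (list, tuple)):
        out = []
        for item in obj:
            out.extend(flatten(item))
        return out
    # Base case: a single leaf.
    return [obj]


# Illustration with plain numbers standing in for tensors:
print(flatten([1, (2, [3]), {"a": 4}]))  # → [1, 2, 3, 4]
```

With such a helper, a nested result structure could then be passed as `torch.autograd.backward(flatten(res), flatten(grads))` (hypothetical call, for illustration only).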