Open ahof1704 opened 2 years ago
@ahof1704 Hi, I'm having the same problem. Were you able to figure out how to solve it? Thanks!
Unfortunately, no. I just switched to scipy for linear interpolation instead.
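For reference, a minimal sketch of that fallback, assuming 1-D data that can be moved to the CPU as NumPy arrays; note that the gradient is lost once the values pass through scipy, so this only works if you don't need to backpropagate through the interpolation:

```python
import numpy as np
import torch
from scipy.interpolate import interp1d

# Hypothetical 1-D data: known sample points (x, y) and query points x_new.
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2.0 * np.pi * x)
x_new = np.linspace(0.0, 1.0, 200)

# scipy's interp1d returns a callable piecewise-linear interpolant.
f = interp1d(x, y, kind="linear")
y_new = torch.from_numpy(f(x_new))  # back to a tensor, but detached from autograd
```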
From what I understand so far, this is due to the way torch.autograd.grad handles batched gradient computation. You can work around it either by calling output.sum(dim=0) before taking the gradient, or by computing the full Jacobian (the latter is a little tricky).
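A rough sketch of both workarounds, assuming the interpolated output has a leading batch dimension and you need its gradient with respect to the input; the tensors below are illustrative stand-ins, not code from this repo:

```python
import torch

# Illustrative stand-in for a batched, differentiable output of shape (batch,).
x = torch.linspace(0.0, 1.0, 8, requires_grad=True)
output = torch.sin(x) * x

# torch.autograd.grad expects a scalar output unless grad_outputs is supplied,
# so reduce over the batch dimension first. Since differentiation is linear and
# each output element here depends only on its own input element, grad_x still
# holds the per-sample gradients.
grad_x, = torch.autograd.grad(output.sum(dim=0), x)

# Alternative: the full Jacobian (one row per output element), heavier to compute.
jac = torch.autograd.functional.jacobian(lambda t: torch.sin(t) * t, x)
```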
Sorry for my slow response on this. I would be glad to receive a PR if you find a way.
I also ran into the same problem. The backward() call just hangs for a long time and then raises the same error. Any solutions? Thanks!
Hi,
First of all, thank you for sharing your code. I would like to use your interpolation during training, but unfortunately I get the following error message during the backward pass.
Here is the snippet showing how the interpolation takes place
Any ideas about the cause of the problem? Thanks!
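The original error message and snippet are not reproduced above. Purely as a hypothetical illustration (this is not the author's code, and the interpolation function below is a pure-PyTorch stand-in for whatever the repository provides), a training-style use where the interpolated values feed a loss and the error surfaces at backward() might look like:

```python
import torch

def linear_interp(x, y, x_new):
    # Stand-in for the repository's interpolation: piecewise-linear,
    # differentiable with respect to y and x_new.
    idx = torch.searchsorted(x, x_new).clamp(1, x.numel() - 1)
    x0, x1 = x[idx - 1], x[idx]
    y0, y1 = y[idx - 1], y[idx]
    w = (x_new - x0) / (x1 - x0)
    return y0 + w * (y1 - y0)

# Hypothetical training-time use: interpolate values that require gradients
# onto new query points, compare against a target, and backpropagate.
x = torch.linspace(0.0, 1.0, 32)
y = torch.sin(2 * torch.pi * x).requires_grad_(True)  # pretend these come from a model
x_new = torch.rand(128)

pred = linear_interp(x, y, x_new)
loss = (pred - torch.cos(2 * torch.pi * x_new)).pow(2).mean()
loss.backward()  # the step where the reported error appears
```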