Open vaib-saxena opened 4 years ago
I'm facing the same problem. Here's a thread comparing the old and the new method:
https://discuss.pytorch.org/t/custom-autograd-function-must-it-be-static/14980
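For reference, the new-style API that thread describes requires `forward` and `backward` to be `@staticmethod`s that pass state through a `ctx` object, and the function to be invoked via `.apply()` rather than instantiated. A minimal toy sketch (a hypothetical `Square` function, not part of torchqrnn):

```python
import torch

# New-style autograd Function: static forward/backward,
# state passed via ctx instead of self.
class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)   # stash tensors needed in backward
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x # d(x^2)/dx = 2x

x = torch.tensor([3.0], requires_grad=True)
y = Square.apply(x)                # call via .apply, never instantiate
y.backward()
print(x.grad)                      # tensor([6.])
```

Porting torchqrnn would mean applying the same transformation to its `ForgetMult` function: move any state stored on `self` onto `ctx` and call it through `.apply()`.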
Hello, I had the same error. It seems that PyTorch removed backward compatibility for the old-style non-static methods of torch.autograd.Function in version 1.5.0. I downgraded PyTorch to 1.4.0, and it's working.
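If you just need the repo to run as-is, the downgrade above can be done with a version pin (assuming a pip-based environment; conda users would use the equivalent `conda install` command):

```shell
# Pin the last release that still accepts old-style
# (non-static) autograd Functions.
pip install torch==1.4.0
```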
Hi all,
I am currently working on a small image-captioning project. I came across QRNN and thought of replacing the LSTM with a QRNN. Everything worked fine with the LSTM, albeit with longer training times, but as soon as I replaced the LSTM with a QRNN, I got this error:
Legacy autograd function with non-static forward method is deprecated. Please use new-style autograd function with static forward method. (Example: https://pytorch.org/docs/stable/autograd.html#torch.autograd.Function)
Even when running the sample code provided in this repo, I get the same error:

```python
import torch
from torchqrnn import QRNN

seq_len, batch_size, hidden_size = 7, 20, 256
size = (seq_len, batch_size, hidden_size)
X = torch.autograd.Variable(torch.rand(size), requires_grad=True).cuda()

qrnn = QRNN(hidden_size, hidden_size, num_layers=2, dropout=0.4)
qrnn.cuda()
output, hidden = qrnn(X)

print(output.size(), hidden.size())
```
RuntimeError: Legacy autograd function with non-static forward method is deprecated. Please use new-style autograd function with static forward method. (Example: https://pytorch.org/docs/stable/autograd.html#torch.autograd.Function)
Please tell me how to get rid of this error. Thanks!