Open XinyiYang5 opened 5 years ago
The inplace operation is defined here. I think I did it this way because in PyTorch 0.4.1, dropout could only be applied to a packed sequence by first unpacking it, applying dropout, and then repacking it. This was likely fixed in PyTorch 1.0...
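For reference, the unpack → dropout → repack workaround mentioned above can be sketched with `pad_packed_sequence` and `pack_padded_sequence`. This is a minimal illustration with assumed shapes and a placeholder dropout rate, not the repo's actual code:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Assumed example batch: 3 sequences, max length 5, feature dim 4,
# with lengths sorted in descending order as 0.4.1 required.
seqs = torch.randn(3, 5, 4)
lengths = torch.tensor([5, 3, 2])
packed = pack_padded_sequence(seqs, lengths, batch_first=True)

# Unpack to a padded tensor, apply (non-inplace) dropout, repack.
padded, lens = pad_packed_sequence(packed, batch_first=True)
dropped = nn.Dropout(0.5)(padded)
repacked = pack_padded_sequence(dropped, lens, batch_first=True)
```

In newer PyTorch versions one can also build a new `PackedSequence` from `dropout(packed.data)` directly, which avoids the unpack/repack round trip.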
I was getting the exact same error with torch==1.1.0.
Changing
self.emb_dropout = nn.Dropout(0.50, inplace=True)
self.lstm_dropout = nn.Dropout(0.20, inplace=True)
to
self.emb_dropout = nn.Dropout(0.50)
self.lstm_dropout = nn.Dropout(0.20)
fixed it, just as the author above described. Thank you.
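To see why dropping `inplace=True` fixes the error: an inplace dropout overwrites a tensor that autograd may still need for the backward pass, which triggers exactly this RuntimeError. A minimal standalone reproduction (assumed toy layers, not the repo's actual model):

```python
import torch
import torch.nn as nn

x = torch.randn(4, 8, requires_grad=True)
layer = nn.Linear(8, 8)

# sigmoid saves its output for the backward pass...
h = torch.sigmoid(layer(x))
# ...but an inplace dropout overwrites that saved output.
nn.Dropout(0.5, inplace=True)(h)
try:
    h.sum().backward()
except RuntimeError as e:
    print("caught:", e)  # "...modified by an inplace operation"

# The fix from this thread: a non-inplace dropout returns a new
# tensor and leaves the saved activation untouched.
h2 = torch.sigmoid(layer(x))
out = nn.Dropout(0.5)(h2)
out.sum().backward()  # succeeds
```

Whether the inplace version errors depends on which op precedes it (autograd only complains when the overwritten tensor is needed for gradients), which is why the same code can work in one PyTorch version or model and fail in another.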
Hi, when I ran the coref.py file, I encountered a RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation. I tried PyTorch 0.4.1 from the requirements.txt as well as PyTorch 1.0, but both gave the same error. Could you please look into this? Thanks!