Closed: robvanvolt closed this issue 3 years ago
Thanks for your interest in this. You can find the in-place operation in the LeakyReLU of the Residual block. The code looks like this:
self.resblock = nn.Sequential(
    nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(out_channels),
    nn.LeakyReLU(inplace=True),
    nn.Conv2d(out_channels, out_channels, kernel_size=1, bias=False),
    nn.BatchNorm2d(out_channels),
)
In all the latest experiments I have not used the Residual block, because it did not add any improvement that could be directly attributed to it. More information on inplace=True can be found in this discussion.
Hope this helps.
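For reference, a minimal sketch of the same block with the in-place activation disabled (inplace=False is the default); the channel values here are placeholders, not the project's actual settings:

import torch
import torch.nn as nn

in_channels, out_channels = 3, 64  # placeholder values for illustration

# Same block as above, but the LeakyReLU is applied out-of-place,
# so autograd's saved intermediate tensors are left untouched.
resblock = nn.Sequential(
    nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(out_channels),
    nn.LeakyReLU(),  # no inplace=True
    nn.Conv2d(out_channels, out_channels, kernel_size=1, bias=False),
    nn.BatchNorm2d(out_channels),
)

x = torch.randn(1, in_channels, 16, 16)
out = resblock(x)  # no in-place mutation of intermediate tensors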
Thank you for the quick response!
I tried add_residual = False and commenting out the complete ResidualLayer, but the same error message still appears.
Unfortunately, SoftmaxBackward is still expecting version 0, as above.
Do you have another idea for solving the problem?
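As an aside, the "expected version 0" in that message refers to autograd's internal version counter, which every in-place operation increments. A minimal sketch of that mechanism, using the internal _version attribute purely for illustration:

import torch

x = torch.randn(4, requires_grad=True)
y = torch.softmax(x, dim=0)  # autograd saves y for SoftmaxBackward
print(y._version)            # 0 -- the version autograd expects at backward time

y.mul_(2)                    # any in-place op bumps the version counter
print(y._version)            # 1

try:
    y.sum().backward()       # SoftmaxBackward finds version 1, expected 0
except RuntimeError as e:
    print(e)                 # "... modified by an inplace operation ..."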
Since the error is talking about a tensor with shape [64, 3000, 16, 16], it must be referring to the quantised image. So this must be the forward() method in class VQVAE_v3, here.
Thank you very much, I was finally able to find the in-place operation. Using
softmax = softmax.scatter(1, torch.argmax(softmax, dim=1).unsqueeze(1), 1)
instead of
softmax = softmax.scatter_(1, torch.argmax(softmax, dim=1).unsqueeze(1), 1)
fixes the inplace warning! :-)
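For readers hitting the same error, a small self-contained sketch of why that one-character change matters: scatter returns a new tensor, while scatter_ mutates the softmax output that SoftmaxBackward still needs at backward time. The shapes below are placeholders:

import torch

logits = torch.randn(2, 5, requires_grad=True)

# Out-of-place: builds a new tensor with 1 written at the argmax
# positions, leaving the saved `softmax` output untouched.
softmax = torch.softmax(logits, dim=1)
hard = softmax.scatter(1, torch.argmax(softmax, dim=1).unsqueeze(1), 1)
hard.sum().backward()  # fine

# In-place: mutates the softmax output itself, so its version counter
# no longer matches what SoftmaxBackward recorded.
softmax2 = torch.softmax(logits, dim=1)
softmax2.scatter_(1, torch.argmax(softmax2, dim=1).unsqueeze(1), 1)
try:
    softmax2.sum().backward()
except RuntimeError as e:
    print(e)  # "... is at version 1; expected version 0 instead."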
Really cool project that I tried to recreate! Unfortunately, after testing, discrete_vae.py gives the following error because of an in-place operation I cannot find:
Could you give me a hint as to where the in-place operation might be? I am running this code on Windows with Python 3.8.7 64-bit, if that helps!
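A general way to track down such an operation, sketched here with a toy reproduction rather than the project's actual code, is PyTorch's anomaly detection: it makes the backward error include a second traceback pointing at the forward-pass line that created the tensor later modified in-place.

import torch

# Enable anomaly detection before running the failing training step.
torch.autograd.set_detect_anomaly(True)

x = torch.randn(2, 5, requires_grad=True)
y = torch.softmax(x, dim=1)
y.scatter_(1, torch.argmax(y, dim=1).unsqueeze(1), 1)  # offending in-place op
try:
    y.sum().backward()  # the warning now points at the softmax line above
except RuntimeError as e:
    print(e)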