Open CryptoSalamander opened 1 year ago
I got an error with PyTorch 1.12.0 on this line:
https://github.com/microsoft/Swin-Transformer/blob/b720b4191588c19222ccf129860e905fb02373a7/models/swin_transformer_v2.py#L156

Since `torch.clamp` was updated in 1.12.0, the latest version of PyTorch, its `min`/`max` arguments must now be on the same device as the input tensor: https://github.com/pytorch/pytorch/pull/77035

In 1.11.0 this line worked without problems because there was no argument type promotion before 1.12.0, but now it should be fixed.

I have simply solved it as follows:

```python
logit_scale = torch.clamp(self.logit_scale, max=torch.log(torch.tensor(1. / 0.01)).to(self.logit_scale.get_device())).exp()
```

A fix to this problem would be very useful.
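As a minimal, self-contained sketch of the same device-safe clamp pattern (not the exact Swin Transformer code): a learnable logit scale is capped so that its exponential never exceeds `1 / 0.01 = 100`. The example uses `.to(logit_scale.device)` instead of `.get_device()` as an illustrative variation, since `.get_device()` returns `-1` for CPU tensors and this keeps the snippet runnable anywhere.

```python
import torch

# A learnable temperature parameter, stored in log space.
logit_scale = torch.nn.Parameter(torch.log(torch.tensor(10.0)))

# Since PyTorch 1.12.0, a tensor passed as `max` must live on the same
# device as the input, so move the bound explicitly before clamping.
max_bound = torch.log(torch.tensor(1.0 / 0.01)).to(logit_scale.device)

# Clamp in log space, then exponentiate to get the effective scale.
scale = torch.clamp(logit_scale, max=max_bound).exp()

print(float(scale))  # exp(log(10)) = 10.0, already below the 100.0 cap
```

If `logit_scale` instead held `log(1000.0)`, the clamp would bring it down to `log(100.0)` and `scale` would come out as `100.0`.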