Closed: shujunyy123 closed this issue 2 years ago
Sorry, I cannot get your point. Could you describe your problem more clearly?
The two calls pass different arguments: (x, torch.cat([x_global, x], dim=1)) != (x, x). When I run it in debug mode, there is an error.
./models/base/selfattention.py implements a self-attention-like structure, not self-attention in the narrow sense.
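For illustration only, here is a minimal sketch of such a self-attention-like block (this is not the repository's actual ./models/base/selfattention.py; the class name SelfAttentionLikeBlock and the channel arguments are made up). The point is that the query features and the key/value features are projected separately, so both forward(x, x) and forward(x, torch.cat([x_global, x], dim=1)) are valid calls even though the second input has a different channel count.

```python
import torch
import torch.nn as nn


class SelfAttentionLikeBlock(nn.Module):
    """Attention where query and key/value features may come from different tensors."""

    def __init__(self, query_channels, key_channels, hidden_channels):
        super().__init__()
        # Separate projections let the two inputs have different channel counts.
        self.query_proj = nn.Conv2d(query_channels, hidden_channels, kernel_size=1)
        self.key_proj = nn.Conv2d(key_channels, hidden_channels, kernel_size=1)
        self.value_proj = nn.Conv2d(key_channels, hidden_channels, kernel_size=1)

    def forward(self, query_feats, key_feats):
        b, _, h, w = query_feats.shape
        q = self.query_proj(query_feats).flatten(2).transpose(1, 2)  # (B, HW, C)
        k = self.key_proj(key_feats).flatten(2)                      # (B, C, H'W')
        v = self.value_proj(key_feats).flatten(2).transpose(1, 2)    # (B, H'W', C)
        attn = torch.softmax(q @ k / q.shape[-1] ** 0.5, dim=-1)     # (B, HW, H'W')
        context = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)    # (B, C, H, W)
        return context


# forward(x, x) reduces to plain self-attention (the isanet.py usage);
# forward(x, torch.cat([x_global, x], dim=1)) is the imagelevel.py-style usage.
x = torch.randn(2, 64, 32, 32)
x_global = torch.randn(2, 64, 32, 32)
block_self = SelfAttentionLikeBlock(64, 64, 32)
block_cross = SelfAttentionLikeBlock(64, 128, 32)
print(block_self(x, x).shape)                                 # torch.Size([2, 32, 32, 32])
print(block_cross(x, torch.cat([x_global, x], dim=1)).shape)  # torch.Size([2, 32, 32, 32])
```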
imagelevel.py:47: feats_il = self.correlate_net(x, torch.cat([x_global, x], dim=1))
isanet.py:47: context = super(SelfAttentionBlock, self).forward(x, x)
Is there any problem here? Is it a bug?