Closed chen048738 closed 5 years ago
In PAM_Module forward:
out = torch.bmm(proj_value, attention.permute(0, 2, 1))
why does attention (B x (HxW) x (HxW)) need permute(0, 2, 1)?
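For reference, here is a minimal sketch of the shape flow in a PAM-style forward pass. The layer names (`query_conv`, `key_conv`, `value_conv`) and the 1x1 projections follow the usual DANet layout but are assumed here, not copied from the repo:

```python
import torch
import torch.nn.functional as F

B, C, H, W = 2, 8, 4, 4          # toy sizes; reduced channel dim C' = C // 2 here
N = H * W

x = torch.randn(B, C, H, W)

# assumed 1x1 projections, as in a typical PAM-style module
query_conv = torch.nn.Conv2d(C, C // 2, kernel_size=1)
key_conv = torch.nn.Conv2d(C, C // 2, kernel_size=1)
value_conv = torch.nn.Conv2d(C, C, kernel_size=1)

proj_query = query_conv(x).view(B, -1, N).permute(0, 2, 1)   # B x N x C'
proj_key = key_conv(x).view(B, -1, N)                        # B x C' x N
energy = torch.bmm(proj_query, proj_key)                     # B x N x N
attention = F.softmax(energy, dim=-1)                        # each row sums to 1

proj_value = value_conv(x).view(B, -1, N)                    # B x C x N

# out[b, c, j] = sum_i proj_value[b, c, i] * attention[b, j, i]
out = torch.bmm(proj_value, attention.permute(0, 2, 1))      # B x C x N
out = out.view(B, C, H, W)
print(out.shape)                                             # torch.Size([2, 8, 4, 4])
```

With the permute, the feature at output position j is a weighted sum over all positions i using the j-th row of `attention`, which sums to 1 after the softmax over the last dim; without the permute, it would use the j-th column, whose entries do not sum to 1.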
I think it is an error; permute(0, 2, 1) should be removed.
@WangboML, if we don't permute, then it will be different from what the paper says, right?
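As a quick way to see that the two orderings are not interchangeable, here is a small check on hypothetical toy tensors showing that `bmm(V, A.permute(0, 2, 1))` and `bmm(V, A)` only agree when `A` happens to be symmetric:

```python
import torch

torch.manual_seed(0)
B, C, N = 1, 3, 5

value = torch.randn(B, C, N)                  # stands in for proj_value: B x C x N
energy = torch.randn(B, N, N)
attention = torch.softmax(energy, dim=-1)     # generally NOT symmetric

with_permute = torch.bmm(value, attention.permute(0, 2, 1))
without_permute = torch.bmm(value, attention)

# the two aggregations differ unless attention equals its own transpose
print(torch.allclose(with_permute, without_permute))          # False
print(torch.allclose(attention, attention.permute(0, 2, 1)))  # False
```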