junfu1115 / DANet

Dual Attention Network for Scene Segmentation (CVPR2019)
MIT License

A question about PAM_Module #64

Closed chen048738 closed 5 years ago

chen048738 commented 5 years ago

In PAM_Module's forward:

out = torch.bmm(proj_value, attention.permute(0, 2, 1))

Why does attention (B x (HxW) x (HxW)) need permute(0, 2, 1)?
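For context, here is a minimal sketch of the position attention forward pass with shapes annotated (layer names follow the repository's PAM_Module; the 1x1 conv channel reduction is the usual in_dim // 8 choice). The key point is that the softmax is taken over the last dimension of energy, so each *row* i of attention is a normalized distribution over positions j, and the permute lines those rows up with the output positions:

```python
import torch
import torch.nn as nn

class PAM(nn.Module):
    """Minimal sketch of DANet's position attention module (PAM_Module)."""
    def __init__(self, in_dim):
        super().__init__()
        self.query_conv = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.key_conv = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.value_conv = nn.Conv2d(in_dim, in_dim, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        B, C, H, W = x.size()
        N = H * W
        # proj_query: B x N x C', proj_key: B x C' x N
        proj_query = self.query_conv(x).view(B, -1, N).permute(0, 2, 1)
        proj_key = self.key_conv(x).view(B, -1, N)
        # energy[b, i, j] = <query_i, key_j>  ->  B x N x N
        energy = torch.bmm(proj_query, proj_key)
        # softmax over the LAST dim: each row i sums to 1 over j
        attention = self.softmax(energy)
        proj_value = self.value_conv(x).view(B, -1, N)  # B x C x N
        # out[b, c, i] = sum_j value[b, c, j] * attention[b, i, j]
        # bmm contracts value's last axis against the first of attention's
        # last two axes, so the transpose is what makes position i of the
        # output aggregate values with its own (normalized) attention row.
        out = torch.bmm(proj_value, attention.permute(0, 2, 1))
        out = out.view(B, C, H, W)
        return self.gamma * out + x
```

Without the permute, out[b, c, i] would instead be weighted by attention[:, i], i.e. a *column* of the attention matrix, which is not normalized by the softmax.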

WangboML commented 4 years ago

I think it was an error; the permute(0, 2, 1) should be removed.

Raazzta commented 4 years ago

@WangboML, if we don't permute, the result will differ from what the paper describes, right?
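A quick numeric check can settle whether the permute matters (the sizes below are made up for illustration). Because softmax is applied over the last dimension, the rows of attention sum to 1 but the columns generally do not, so bmm with and without the transpose give different results whenever attention is not symmetric:

```python
import torch

torch.manual_seed(0)
B, C, N = 1, 3, 4                       # batch, channels, positions (toy sizes)
energy = torch.randn(B, N, N)
attention = torch.softmax(energy, dim=-1)  # each ROW sums to 1
ones = torch.ones(B, C, N)              # stand-in for proj_value

# With the permute: out[b, c, i] = sum_j attention[b, i, j] = 1 exactly,
# i.e. every output position is a convex combination of value positions.
with_permute = torch.bmm(ones, attention.permute(0, 2, 1))

# Without the permute: out[b, c, i] = sum_j attention[b, j, i], a COLUMN
# sum, which need not equal 1.
without_permute = torch.bmm(ones, attention)

print(with_permute.squeeze()[0])     # all ones
print(without_permute.squeeze()[0])  # generally not ones
```

So the two expressions only coincide when attention is symmetric; for a generic energy matrix they differ, which matches the point that dropping the permute would change the computation.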