brain-research / self-attention-gan


Transpose of the attention map and softmax #17

Closed abelmouhcine closed 5 years ago

abelmouhcine commented 5 years ago

Hello, should the attention map be transposed? I can't find that step in the paper. Also, I think the softmax should use dim=0.
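
For context, a minimal sketch of the self-attention forward pass the question seems to refer to, written PyTorch-style (the `dim=` wording suggests that framing). The shapes and names below are illustrative assumptions, not code from this repository:

```python
import torch
import torch.nn.functional as F

def self_attention(query, key, value):
    """query, key: (B, C', N); value: (B, C, N), with N = H*W flattened positions.
    Illustrative sketch only, not the repository's implementation."""
    # energy[b, i, j] = <query_i, key_j>
    energy = torch.bmm(query.permute(0, 2, 1), key)      # (B, N, N)
    # Normalize over j (the key positions): each row of the map sums to 1.
    attention = F.softmax(energy, dim=-1)                # (B, N, N)
    # out[b, :, i] = sum_j attention[b, i, j] * value[b, :, j]
    out = torch.bmm(value, attention.permute(0, 2, 1))   # (B, C, N)
    return out, attention
```

Under this layout the softmax runs over the last axis and the attention map is transposed in the final product. If the energy matrix is instead built as keyᵀ·query, the softmax axis and the transpose trade places, so the two conventions give the same output; whether that matches this repository's TensorFlow code is a separate question.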